00:00:00.001 Started by upstream project "autotest-per-patch" build number 126107 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.127 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.128 The recommended git tool is: git 00:00:00.128 using credential 00000000-0000-0000-0000-000000000002 00:00:00.130 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.159 Fetching changes from the remote Git repository 00:00:00.161 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.194 Using shallow fetch with depth 1 00:00:00.194 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.194 > git --version # timeout=10 00:00:00.219 > git --version # 'git version 2.39.2' 00:00:00.219 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.240 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.240 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.833 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.885 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.906 Checking out Revision 308e970df89ed396a3f9dcf22fba8891259694e4 (FETCH_HEAD) 00:00:02.906 > git config core.sparsecheckout # timeout=10 00:00:02.915 > git read-tree -mu HEAD # timeout=10 00:00:02.929 > git checkout -f 308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=5 00:00:02.947 Commit message: "jjb/create-perf-report: make job run concurrent" 00:00:02.947 > git rev-list --no-walk 
308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=10 00:00:03.046 [Pipeline] Start of Pipeline 00:00:03.061 [Pipeline] library 00:00:03.063 Loading library shm_lib@master 00:00:03.063 Library shm_lib@master is cached. Copying from home. 00:00:03.083 [Pipeline] node 00:00:03.094 Running on WFP3 in /var/jenkins/workspace/crypto-phy-autotest 00:00:03.096 [Pipeline] { 00:00:03.107 [Pipeline] catchError 00:00:03.108 [Pipeline] { 00:00:03.118 [Pipeline] wrap 00:00:03.125 [Pipeline] { 00:00:03.133 [Pipeline] stage 00:00:03.135 [Pipeline] { (Prologue) 00:00:03.343 [Pipeline] sh 00:00:03.628 + logger -p user.info -t JENKINS-CI 00:00:03.644 [Pipeline] echo 00:00:03.646 Node: WFP3 00:00:03.653 [Pipeline] sh 00:00:03.946 [Pipeline] setCustomBuildProperty 00:00:03.961 [Pipeline] echo 00:00:03.962 Cleanup processes 00:00:03.967 [Pipeline] sh 00:00:04.248 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.248 424177 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.258 [Pipeline] sh 00:00:04.535 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.535 ++ grep -v 'sudo pgrep' 00:00:04.535 ++ awk '{print $1}' 00:00:04.535 + sudo kill -9 00:00:04.535 + true 00:00:04.548 [Pipeline] cleanWs 00:00:04.558 [WS-CLEANUP] Deleting project workspace... 00:00:04.558 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.564 [WS-CLEANUP] done 00:00:04.567 [Pipeline] setCustomBuildProperty 00:00:04.580 [Pipeline] sh 00:00:04.862 + sudo git config --global --replace-all safe.directory '*' 00:00:04.927 [Pipeline] httpRequest 00:00:04.946 [Pipeline] echo 00:00:04.947 Sorcerer 10.211.164.101 is alive 00:00:04.953 [Pipeline] httpRequest 00:00:04.956 HttpMethod: GET 00:00:04.957 URL: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:04.959 Sending request to url: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:04.961 Response Code: HTTP/1.1 200 OK 00:00:04.961 Success: Status code 200 is in the accepted range: 200,404 00:00:04.962 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:05.701 [Pipeline] sh 00:00:05.986 + tar --no-same-owner -xf jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:06.002 [Pipeline] httpRequest 00:00:06.022 [Pipeline] echo 00:00:06.024 Sorcerer 10.211.164.101 is alive 00:00:06.030 [Pipeline] httpRequest 00:00:06.034 HttpMethod: GET 00:00:06.034 URL: http://10.211.164.101/packages/spdk_b2ac96cc231173e6cb375d29bc2848cfae3add6a.tar.gz 00:00:06.035 Sending request to url: http://10.211.164.101/packages/spdk_b2ac96cc231173e6cb375d29bc2848cfae3add6a.tar.gz 00:00:06.046 Response Code: HTTP/1.1 200 OK 00:00:06.047 Success: Status code 200 is in the accepted range: 200,404 00:00:06.048 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_b2ac96cc231173e6cb375d29bc2848cfae3add6a.tar.gz 00:01:08.087 [Pipeline] sh 00:01:08.373 + tar --no-same-owner -xf spdk_b2ac96cc231173e6cb375d29bc2848cfae3add6a.tar.gz 00:01:10.947 [Pipeline] sh 00:01:11.223 + git -C spdk log --oneline -n5 00:01:11.223 b2ac96cc2 lib/reduce: merge consecutive IO requests 00:01:11.223 88c9e0c47 nvmf: fix duplicate service registration and memory leak in mdns_server 00:01:11.223 400e0cafc lib/bdev: Fix race between bdev 
registration and bdev open 00:01:11.223 338475bd8 lib/thread: Align spdk_thread allocation on cache line 00:01:11.223 cde2142cd scsi pr: only registrants are consider holders when reservation in mode 7/8 00:01:11.234 [Pipeline] } 00:01:11.247 [Pipeline] // stage 00:01:11.255 [Pipeline] stage 00:01:11.257 [Pipeline] { (Prepare) 00:01:11.273 [Pipeline] writeFile 00:01:11.290 [Pipeline] sh 00:01:11.570 + logger -p user.info -t JENKINS-CI 00:01:11.582 [Pipeline] sh 00:01:11.865 + logger -p user.info -t JENKINS-CI 00:01:11.879 [Pipeline] sh 00:01:12.163 + cat autorun-spdk.conf 00:01:12.163 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:12.163 SPDK_TEST_BLOCKDEV=1 00:01:12.163 SPDK_TEST_ISAL=1 00:01:12.163 SPDK_TEST_CRYPTO=1 00:01:12.163 SPDK_TEST_REDUCE=1 00:01:12.163 SPDK_TEST_VBDEV_COMPRESS=1 00:01:12.163 SPDK_RUN_UBSAN=1 00:01:12.171 RUN_NIGHTLY=0 00:01:12.178 [Pipeline] readFile 00:01:12.205 [Pipeline] withEnv 00:01:12.208 [Pipeline] { 00:01:12.219 [Pipeline] sh 00:01:12.503 + set -ex 00:01:12.503 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:01:12.503 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:12.503 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:12.503 ++ SPDK_TEST_BLOCKDEV=1 00:01:12.503 ++ SPDK_TEST_ISAL=1 00:01:12.503 ++ SPDK_TEST_CRYPTO=1 00:01:12.503 ++ SPDK_TEST_REDUCE=1 00:01:12.503 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:12.503 ++ SPDK_RUN_UBSAN=1 00:01:12.503 ++ RUN_NIGHTLY=0 00:01:12.503 + case $SPDK_TEST_NVMF_NICS in 00:01:12.503 + DRIVERS= 00:01:12.503 + [[ -n '' ]] 00:01:12.503 + exit 0 00:01:12.513 [Pipeline] } 00:01:12.533 [Pipeline] // withEnv 00:01:12.541 [Pipeline] } 00:01:12.561 [Pipeline] // stage 00:01:12.572 [Pipeline] catchError 00:01:12.575 [Pipeline] { 00:01:12.594 [Pipeline] timeout 00:01:12.594 Timeout set to expire in 40 min 00:01:12.597 [Pipeline] { 00:01:12.616 [Pipeline] stage 00:01:12.619 [Pipeline] { (Tests) 00:01:12.638 [Pipeline] sh 00:01:12.924 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh 
/var/jenkins/workspace/crypto-phy-autotest 00:01:12.924 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:01:12.924 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:01:12.924 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:01:12.924 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:12.924 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:01:12.924 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:01:12.924 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:12.924 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:01:12.924 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:12.924 + [[ crypto-phy-autotest == pkgdep-* ]] 00:01:12.924 + cd /var/jenkins/workspace/crypto-phy-autotest 00:01:12.924 + source /etc/os-release 00:01:12.924 ++ NAME='Fedora Linux' 00:01:12.924 ++ VERSION='38 (Cloud Edition)' 00:01:12.924 ++ ID=fedora 00:01:12.924 ++ VERSION_ID=38 00:01:12.924 ++ VERSION_CODENAME= 00:01:12.924 ++ PLATFORM_ID=platform:f38 00:01:12.924 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:12.924 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:12.924 ++ LOGO=fedora-logo-icon 00:01:12.924 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:12.924 ++ HOME_URL=https://fedoraproject.org/ 00:01:12.924 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:12.924 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:12.924 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:12.924 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:12.924 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:12.924 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:12.924 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:12.924 ++ SUPPORT_END=2024-05-14 00:01:12.924 ++ VARIANT='Cloud Edition' 00:01:12.924 ++ VARIANT_ID=cloud 00:01:12.924 + uname -a 00:01:12.924 Linux spdk-wfp-03 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux 
00:01:12.924 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:01:16.218 Hugepages 00:01:16.218 node hugesize free / total 00:01:16.218 node0 1048576kB 0 / 0 00:01:16.218 node0 2048kB 0 / 0 00:01:16.218 node1 1048576kB 0 / 0 00:01:16.218 node1 2048kB 0 / 0 00:01:16.218 00:01:16.218 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:16.218 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:16.218 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:16.218 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:16.218 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:16.218 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:16.218 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:16.218 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:16.218 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:16.218 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:01:16.218 NVMe 0000:5f:00.0 1b96 2600 0 nvme nvme1 nvme1n1 nvme1n2 00:01:16.218 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:16.218 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:16.218 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:16.218 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:16.218 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:16.218 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:16.218 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:16.218 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:16.218 + rm -f /tmp/spdk-ld-path 00:01:16.218 + source autorun-spdk.conf 00:01:16.218 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:16.218 ++ SPDK_TEST_BLOCKDEV=1 00:01:16.218 ++ SPDK_TEST_ISAL=1 00:01:16.218 ++ SPDK_TEST_CRYPTO=1 00:01:16.218 ++ SPDK_TEST_REDUCE=1 00:01:16.218 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:16.218 ++ SPDK_RUN_UBSAN=1 00:01:16.218 ++ RUN_NIGHTLY=0 00:01:16.218 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:16.219 + [[ -n '' ]] 00:01:16.219 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:16.219 + for M in 
/var/spdk/build-*-manifest.txt 00:01:16.219 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:16.219 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:16.219 + for M in /var/spdk/build-*-manifest.txt 00:01:16.219 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:16.219 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:16.219 ++ uname 00:01:16.219 + [[ Linux == \L\i\n\u\x ]] 00:01:16.219 + sudo dmesg -T 00:01:16.219 + sudo dmesg --clear 00:01:16.219 + dmesg_pid=425363 00:01:16.219 + [[ Fedora Linux == FreeBSD ]] 00:01:16.219 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:16.219 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:16.219 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:16.219 + [[ -x /usr/src/fio-static/fio ]] 00:01:16.219 + export FIO_BIN=/usr/src/fio-static/fio 00:01:16.219 + FIO_BIN=/usr/src/fio-static/fio 00:01:16.219 + sudo dmesg -Tw 00:01:16.219 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:16.219 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:01:16.219 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:16.219 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:16.219 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:16.219 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:16.219 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:16.219 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:16.219 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:16.219 Test configuration: 00:01:16.219 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:16.219 SPDK_TEST_BLOCKDEV=1 00:01:16.219 SPDK_TEST_ISAL=1 00:01:16.219 SPDK_TEST_CRYPTO=1 00:01:16.219 SPDK_TEST_REDUCE=1 00:01:16.219 SPDK_TEST_VBDEV_COMPRESS=1 00:01:16.219 SPDK_RUN_UBSAN=1 00:01:16.219 RUN_NIGHTLY=0 11:42:06 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:16.219 11:42:06 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:16.219 11:42:06 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:16.219 11:42:06 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:16.219 11:42:06 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.219 11:42:06 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.219 11:42:06 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.219 11:42:06 -- paths/export.sh@5 -- $ export PATH 00:01:16.219 11:42:06 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.219 11:42:06 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:16.219 11:42:06 -- common/autobuild_common.sh@444 -- $ date +%s 00:01:16.219 11:42:06 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720777326.XXXXXX 00:01:16.219 11:42:06 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720777326.Odq2uR 00:01:16.219 11:42:06 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:01:16.219 11:42:06 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:01:16.219 11:42:06 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:01:16.219 
11:42:06 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:16.219 11:42:06 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:16.219 11:42:06 -- common/autobuild_common.sh@460 -- $ get_config_params 00:01:16.219 11:42:06 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:01:16.219 11:42:06 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.219 11:42:06 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:01:16.219 11:42:06 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:01:16.219 11:42:06 -- pm/common@17 -- $ local monitor 00:01:16.219 11:42:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:16.219 11:42:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:16.219 11:42:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:16.219 11:42:06 -- pm/common@21 -- $ date +%s 00:01:16.219 11:42:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:16.219 11:42:06 -- pm/common@21 -- $ date +%s 00:01:16.219 11:42:06 -- pm/common@25 -- $ sleep 1 00:01:16.219 11:42:06 -- pm/common@21 -- $ date +%s 00:01:16.219 11:42:06 -- pm/common@21 -- $ date +%s 00:01:16.219 11:42:06 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720777326 00:01:16.219 11:42:06 -- pm/common@21 -- $ 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720777326 00:01:16.219 11:42:06 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720777326 00:01:16.219 11:42:06 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720777326 00:01:16.219 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720777326_collect-vmstat.pm.log 00:01:16.219 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720777326_collect-cpu-load.pm.log 00:01:16.219 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720777326_collect-cpu-temp.pm.log 00:01:16.219 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720777326_collect-bmc-pm.bmc.pm.log 00:01:17.157 11:42:07 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:01:17.157 11:42:07 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:17.157 11:42:07 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:17.157 11:42:07 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:17.157 11:42:07 -- spdk/autobuild.sh@16 -- $ date -u 00:01:17.157 Fri Jul 12 09:42:07 AM UTC 2024 00:01:17.157 11:42:07 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:17.157 v24.09-pre-165-gb2ac96cc2 00:01:17.157 11:42:07 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:17.157 11:42:07 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:17.157 11:42:07 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using 
ubsan' 00:01:17.157 11:42:07 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:17.157 11:42:07 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:17.157 11:42:07 -- common/autotest_common.sh@10 -- $ set +x 00:01:17.416 ************************************ 00:01:17.416 START TEST ubsan 00:01:17.416 ************************************ 00:01:17.416 11:42:07 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:01:17.416 using ubsan 00:01:17.416 00:01:17.416 real 0m0.000s 00:01:17.416 user 0m0.000s 00:01:17.416 sys 0m0.000s 00:01:17.416 11:42:07 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:01:17.416 11:42:07 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:17.416 ************************************ 00:01:17.416 END TEST ubsan 00:01:17.416 ************************************ 00:01:17.416 11:42:07 -- common/autotest_common.sh@1142 -- $ return 0 00:01:17.416 11:42:07 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:01:17.416 11:42:07 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:17.416 11:42:07 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:17.416 11:42:07 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:01:17.417 11:42:07 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:17.417 11:42:07 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:17.417 11:42:07 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:17.417 11:42:07 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:01:17.417 11:42:07 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:01:17.417 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:01:17.417 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 
00:01:17.675 Using 'verbs' RDMA provider 00:01:31.264 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:43.503 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:43.503 Creating mk/config.mk...done. 00:01:43.503 Creating mk/cc.flags.mk...done. 00:01:43.503 Type 'make' to build. 00:01:43.503 11:42:32 -- spdk/autobuild.sh@69 -- $ run_test make make -j96 00:01:43.503 11:42:32 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:43.503 11:42:32 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:43.503 11:42:32 -- common/autotest_common.sh@10 -- $ set +x 00:01:43.503 ************************************ 00:01:43.503 START TEST make 00:01:43.503 ************************************ 00:01:43.503 11:42:32 make -- common/autotest_common.sh@1123 -- $ make -j96 00:01:43.503 make[1]: Nothing to be done for 'all'. 00:02:15.608 The Meson build system 00:02:15.608 Version: 1.3.1 00:02:15.608 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:02:15.608 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:02:15.608 Build type: native build 00:02:15.608 Program cat found: YES (/usr/bin/cat) 00:02:15.608 Project name: DPDK 00:02:15.608 Project version: 24.03.0 00:02:15.608 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:02:15.608 C linker for the host machine: cc ld.bfd 2.39-16 00:02:15.608 Host machine cpu family: x86_64 00:02:15.608 Host machine cpu: x86_64 00:02:15.608 Message: ## Building in Developer Mode ## 00:02:15.608 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:15.608 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:15.608 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 
00:02:15.608 Program python3 found: YES (/usr/bin/python3) 00:02:15.608 Program cat found: YES (/usr/bin/cat) 00:02:15.608 Compiler for C supports arguments -march=native: YES 00:02:15.608 Checking for size of "void *" : 8 00:02:15.608 Checking for size of "void *" : 8 (cached) 00:02:15.608 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:02:15.608 Library m found: YES 00:02:15.608 Library numa found: YES 00:02:15.608 Has header "numaif.h" : YES 00:02:15.608 Library fdt found: NO 00:02:15.608 Library execinfo found: NO 00:02:15.608 Has header "execinfo.h" : YES 00:02:15.608 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:15.608 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:15.608 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:15.608 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:15.608 Run-time dependency openssl found: YES 3.0.9 00:02:15.608 Run-time dependency libpcap found: YES 1.10.4 00:02:15.608 Has header "pcap.h" with dependency libpcap: YES 00:02:15.608 Compiler for C supports arguments -Wcast-qual: YES 00:02:15.608 Compiler for C supports arguments -Wdeprecated: YES 00:02:15.608 Compiler for C supports arguments -Wformat: YES 00:02:15.608 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:15.608 Compiler for C supports arguments -Wformat-security: NO 00:02:15.608 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:15.608 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:15.608 Compiler for C supports arguments -Wnested-externs: YES 00:02:15.608 Compiler for C supports arguments -Wold-style-definition: YES 00:02:15.608 Compiler for C supports arguments -Wpointer-arith: YES 00:02:15.608 Compiler for C supports arguments -Wsign-compare: YES 00:02:15.608 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:15.608 Compiler for C supports arguments -Wundef: YES 00:02:15.608 Compiler for C supports arguments -Wwrite-strings: YES 
00:02:15.608 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:15.608 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:15.608 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:15.608 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:15.608 Program objdump found: YES (/usr/bin/objdump) 00:02:15.608 Compiler for C supports arguments -mavx512f: YES 00:02:15.608 Checking if "AVX512 checking" compiles: YES 00:02:15.608 Fetching value of define "__SSE4_2__" : 1 00:02:15.608 Fetching value of define "__AES__" : 1 00:02:15.608 Fetching value of define "__AVX__" : 1 00:02:15.608 Fetching value of define "__AVX2__" : 1 00:02:15.608 Fetching value of define "__AVX512BW__" : 1 00:02:15.608 Fetching value of define "__AVX512CD__" : 1 00:02:15.608 Fetching value of define "__AVX512DQ__" : 1 00:02:15.608 Fetching value of define "__AVX512F__" : 1 00:02:15.608 Fetching value of define "__AVX512VL__" : 1 00:02:15.608 Fetching value of define "__PCLMUL__" : 1 00:02:15.608 Fetching value of define "__RDRND__" : 1 00:02:15.608 Fetching value of define "__RDSEED__" : 1 00:02:15.608 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:15.608 Fetching value of define "__znver1__" : (undefined) 00:02:15.608 Fetching value of define "__znver2__" : (undefined) 00:02:15.608 Fetching value of define "__znver3__" : (undefined) 00:02:15.608 Fetching value of define "__znver4__" : (undefined) 00:02:15.608 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:15.608 Message: lib/log: Defining dependency "log" 00:02:15.608 Message: lib/kvargs: Defining dependency "kvargs" 00:02:15.608 Message: lib/telemetry: Defining dependency "telemetry" 00:02:15.608 Checking for function "getentropy" : NO 00:02:15.608 Message: lib/eal: Defining dependency "eal" 00:02:15.608 Message: lib/ring: Defining dependency "ring" 00:02:15.608 Message: lib/rcu: Defining dependency "rcu" 00:02:15.608 Message: 
lib/mempool: Defining dependency "mempool"
00:02:15.608 Message: lib/mbuf: Defining dependency "mbuf"
00:02:15.608 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:15.608 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:15.608 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:15.608 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:15.608 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:15.608 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:15.608 Compiler for C supports arguments -mpclmul: YES
00:02:15.608 Compiler for C supports arguments -maes: YES
00:02:15.608 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:15.608 Compiler for C supports arguments -mavx512bw: YES
00:02:15.608 Compiler for C supports arguments -mavx512dq: YES
00:02:15.608 Compiler for C supports arguments -mavx512vl: YES
00:02:15.608 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:15.608 Compiler for C supports arguments -mavx2: YES
00:02:15.608 Compiler for C supports arguments -mavx: YES
00:02:15.608 Message: lib/net: Defining dependency "net"
00:02:15.608 Message: lib/meter: Defining dependency "meter"
00:02:15.608 Message: lib/ethdev: Defining dependency "ethdev"
00:02:15.608 Message: lib/pci: Defining dependency "pci"
00:02:15.608 Message: lib/cmdline: Defining dependency "cmdline"
00:02:15.608 Message: lib/hash: Defining dependency "hash"
00:02:15.608 Message: lib/timer: Defining dependency "timer"
00:02:15.608 Message: lib/compressdev: Defining dependency "compressdev"
00:02:15.608 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:15.608 Message: lib/dmadev: Defining dependency "dmadev"
00:02:15.608 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:15.608 Message: lib/power: Defining dependency "power"
00:02:15.608 Message: lib/reorder: Defining dependency "reorder"
00:02:15.608 Message: lib/security: Defining dependency "security"
00:02:15.608 Has header "linux/userfaultfd.h" : YES
00:02:15.608 Has header "linux/vduse.h" : YES
00:02:15.608 Message: lib/vhost: Defining dependency "vhost"
00:02:15.608 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:15.608 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:02:15.608 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:15.608 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:15.608 Compiler for C supports arguments -std=c11: YES
00:02:15.608 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:02:15.608 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:02:15.608 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:02:15.608 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:02:15.608 Run-time dependency libmlx5 found: YES 1.24.46.0
00:02:15.608 Run-time dependency libibverbs found: YES 1.14.46.0
00:02:15.608 Library mtcr_ul found: NO
00:02:15.608 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:02:15.608 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:02:15.608 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:02:15.608 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:02:15.608 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:02:15.608 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:02:15.608 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO
00:02:15.609 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO
00:02:15.609 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES
00:02:15.609 Configuring mlx5_autoconf.h using configuration
00:02:15.609 Message: drivers/common/mlx5: Defining dependency "common_mlx5"
00:02:15.609 Run-time dependency libcrypto found: YES 3.0.9
00:02:15.609 Library IPSec_MB found: YES
00:02:15.609 Fetching value of define "IMB_VERSION_STR" : "1.5.0"
00:02:15.609 Message: drivers/common/qat: Defining dependency "common_qat"
00:02:15.609 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:02:15.609 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:02:15.609 Library IPSec_MB found: YES
00:02:15.609 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached)
00:02:15.609 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb"
00:02:15.609 Compiler for C supports arguments -std=c11: YES (cached)
00:02:15.609 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached)
00:02:15.609 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached)
00:02:15.609 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached)
00:02:15.609 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached)
00:02:15.609 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5"
00:02:15.609 Run-time dependency libisal found: NO (tried pkgconfig)
00:02:15.609 Library libisal found: NO
00:02:15.609 Message: drivers/compress/isal: Defining dependency "compress_isal"
00:02:15.609 Compiler for C supports arguments -std=c11: YES (cached)
00:02:15.609 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached)
00:02:15.609 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached)
00:02:15.609 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached)
00:02:15.609 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached)
00:02:15.609 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5"
00:02:15.609 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:02:15.609 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:02:15.609 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:02:15.609 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:02:15.609 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:02:15.609 Program doxygen found: YES (/usr/bin/doxygen)
00:02:15.609 Configuring doxy-api-html.conf using configuration
00:02:15.609 Configuring doxy-api-man.conf using configuration
00:02:15.609 Program mandb found: YES (/usr/bin/mandb)
00:02:15.609 Program sphinx-build found: NO
00:02:15.609 Configuring rte_build_config.h using configuration
00:02:15.609 Message:
00:02:15.609 =================
00:02:15.609 Applications Enabled
00:02:15.609 =================
00:02:15.609
00:02:15.609 apps:
00:02:15.609
00:02:15.609
00:02:15.609 Message:
00:02:15.609 =================
00:02:15.609 Libraries Enabled
00:02:15.609 =================
00:02:15.609
00:02:15.609 libs:
00:02:15.609 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:15.609 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:02:15.609 cryptodev, dmadev, power, reorder, security, vhost,
00:02:15.609
00:02:15.609 Message:
00:02:15.609 ===============
00:02:15.609 Drivers Enabled
00:02:15.609 ===============
00:02:15.609
00:02:15.609 common:
00:02:15.609 mlx5, qat,
00:02:15.609 bus:
00:02:15.609 auxiliary, pci, vdev,
00:02:15.609 mempool:
00:02:15.609 ring,
00:02:15.609 dma:
00:02:15.609
00:02:15.609 net:
00:02:15.609
00:02:15.609 crypto:
00:02:15.609 ipsec_mb, mlx5,
00:02:15.609 compress:
00:02:15.609 isal, mlx5,
00:02:15.609 vdpa:
00:02:15.609
00:02:15.609
00:02:15.609 Message:
00:02:15.609 =================
00:02:15.609 Content Skipped
00:02:15.609 =================
00:02:15.609
00:02:15.609 apps:
00:02:15.609 dumpcap: explicitly disabled via build config
00:02:15.609 graph: explicitly disabled via build config
00:02:15.609 pdump: explicitly disabled via build config
00:02:15.609 proc-info: explicitly disabled via build config
00:02:15.609 test-acl: explicitly disabled via build config
00:02:15.609 test-bbdev: explicitly disabled via build config
00:02:15.609 test-cmdline: explicitly disabled via build config
00:02:15.609 test-compress-perf: explicitly disabled via build config
00:02:15.609 test-crypto-perf: explicitly disabled via build config
00:02:15.609 test-dma-perf: explicitly disabled via build config
00:02:15.609 test-eventdev: explicitly disabled via build config
00:02:15.609 test-fib: explicitly disabled via build config
00:02:15.609 test-flow-perf: explicitly disabled via build config
00:02:15.609 test-gpudev: explicitly disabled via build config
00:02:15.609 test-mldev: explicitly disabled via build config
00:02:15.609 test-pipeline: explicitly disabled via build config
00:02:15.609 test-pmd: explicitly disabled via build config
00:02:15.609 test-regex: explicitly disabled via build config
00:02:15.609 test-sad: explicitly disabled via build config
00:02:15.609 test-security-perf: explicitly disabled via build config
00:02:15.609
00:02:15.610 libs:
00:02:15.610 argparse: explicitly disabled via build config
00:02:15.610 metrics: explicitly disabled via build config
00:02:15.610 acl: explicitly disabled via build config
00:02:15.610 bbdev: explicitly disabled via build config
00:02:15.610 bitratestats: explicitly disabled via build config
00:02:15.610 bpf: explicitly disabled via build config
00:02:15.610 cfgfile: explicitly disabled via build config
00:02:15.610 distributor: explicitly disabled via build config
00:02:15.610 efd: explicitly disabled via build config
00:02:15.610 eventdev: explicitly disabled via build config
00:02:15.610 dispatcher: explicitly disabled via build config
00:02:15.610 gpudev: explicitly disabled via build config
00:02:15.610 gro: explicitly disabled via build config
00:02:15.610 gso: explicitly disabled via build config
00:02:15.610 ip_frag: explicitly disabled via build config
00:02:15.610 jobstats: explicitly disabled via build config
00:02:15.610 latencystats: explicitly disabled via build config
00:02:15.610 lpm: explicitly disabled via build config
00:02:15.610 member: explicitly disabled via build config
00:02:15.610 pcapng: explicitly disabled via build config
00:02:15.610 rawdev: explicitly disabled via build config
00:02:15.610 regexdev: explicitly disabled via build config
00:02:15.610 mldev: explicitly disabled via build config
00:02:15.610 rib: explicitly disabled via build config
00:02:15.610 sched: explicitly disabled via build config
00:02:15.610 stack: explicitly disabled via build config
00:02:15.610 ipsec: explicitly disabled via build config
00:02:15.610 pdcp: explicitly disabled via build config
00:02:15.610 fib: explicitly disabled via build config
00:02:15.610 port: explicitly disabled via build config
00:02:15.610 pdump: explicitly disabled via build config
00:02:15.610 table: explicitly disabled via build config
00:02:15.610 pipeline: explicitly disabled via build config
00:02:15.610 graph: explicitly disabled via build config
00:02:15.610 node: explicitly disabled via build config
00:02:15.610
00:02:15.610 drivers:
00:02:15.610 common/cpt: not in enabled drivers build config
00:02:15.610 common/dpaax: not in enabled drivers build config
00:02:15.610 common/iavf: not in enabled drivers build config
00:02:15.610 common/idpf: not in enabled drivers build config
00:02:15.610 common/ionic: not in enabled drivers build config
00:02:15.610 common/mvep: not in enabled drivers build config
00:02:15.610 common/octeontx: not in enabled drivers build config
00:02:15.610 bus/cdx: not in enabled drivers build config
00:02:15.610 bus/dpaa: not in enabled drivers build config
00:02:15.610 bus/fslmc: not in enabled drivers build config
00:02:15.610 bus/ifpga: not in enabled drivers build config
00:02:15.610 bus/platform: not in enabled drivers build config
00:02:15.610 bus/uacce: not in enabled drivers build config
00:02:15.610 bus/vmbus: not in enabled drivers build config
00:02:15.610 common/cnxk: not in enabled drivers build config
00:02:15.610 common/nfp: not in enabled drivers build config
00:02:15.610 common/nitrox: not in enabled drivers build config
00:02:15.610 common/sfc_efx: not in enabled drivers build config
00:02:15.610 mempool/bucket: not in enabled drivers build config
00:02:15.610 mempool/cnxk: not in enabled drivers build config
00:02:15.610 mempool/dpaa: not in enabled drivers build config
00:02:15.610 mempool/dpaa2: not in enabled drivers build config
00:02:15.610 mempool/octeontx: not in enabled drivers build config
00:02:15.610 mempool/stack: not in enabled drivers build config
00:02:15.610 dma/cnxk: not in enabled drivers build config
00:02:15.610 dma/dpaa: not in enabled drivers build config
00:02:15.610 dma/dpaa2: not in enabled drivers build config
00:02:15.610 dma/hisilicon: not in enabled drivers build config
00:02:15.610 dma/idxd: not in enabled drivers build config
00:02:15.610 dma/ioat: not in enabled drivers build config
00:02:15.610 dma/skeleton: not in enabled drivers build config
00:02:15.610 net/af_packet: not in enabled drivers build config
00:02:15.610 net/af_xdp: not in enabled drivers build config
00:02:15.610 net/ark: not in enabled drivers build config
00:02:15.610 net/atlantic: not in enabled drivers build config
00:02:15.610 net/avp: not in enabled drivers build config
00:02:15.610 net/axgbe: not in enabled drivers build config
00:02:15.610 net/bnx2x: not in enabled drivers build config
00:02:15.610 net/bnxt: not in enabled drivers build config
00:02:15.610 net/bonding: not in enabled drivers build config
00:02:15.610 net/cnxk: not in enabled drivers build config
00:02:15.610 net/cpfl: not in enabled drivers build config
00:02:15.610 net/cxgbe: not in enabled drivers build config
00:02:15.610 net/dpaa: not in enabled drivers build config
00:02:15.610 net/dpaa2: not in enabled drivers build config
00:02:15.610 net/e1000: not in enabled drivers build config
00:02:15.610 net/ena: not in enabled drivers build config
00:02:15.610 net/enetc: not in enabled drivers build config
00:02:15.610 net/enetfec: not in enabled drivers build config
00:02:15.610 net/enic: not in enabled drivers build config
00:02:15.610 net/failsafe: not in enabled drivers build config
00:02:15.610 net/fm10k: not in enabled drivers build config
00:02:15.610 net/gve: not in enabled drivers build config
00:02:15.610 net/hinic: not in enabled drivers build config
00:02:15.610 net/hns3: not in enabled drivers build config
00:02:15.610 net/i40e: not in enabled drivers build config
00:02:15.610 net/iavf: not in enabled drivers build config
00:02:15.610 net/ice: not in enabled drivers build config
00:02:15.610 net/idpf: not in enabled drivers build config
00:02:15.610 net/igc: not in enabled drivers build config
00:02:15.610 net/ionic: not in enabled drivers build config
00:02:15.610 net/ipn3ke: not in enabled drivers build config
00:02:15.610 net/ixgbe: not in enabled drivers build config
00:02:15.610 net/mana: not in enabled drivers build config
00:02:15.610 net/memif: not in enabled drivers build config
00:02:15.610 net/mlx4: not in enabled drivers build config
00:02:15.610 net/mlx5: not in enabled drivers build config
00:02:15.610 net/mvneta: not in enabled drivers build config
00:02:15.610 net/mvpp2: not in enabled drivers build config
00:02:15.610 net/netvsc: not in enabled drivers build config
00:02:15.610 net/nfb: not in enabled drivers build config
00:02:15.610 net/nfp: not in enabled drivers build config
00:02:15.610 net/ngbe: not in enabled drivers build config
00:02:15.610 net/null: not in enabled drivers build config
00:02:15.610 net/octeontx: not in enabled drivers build config
00:02:15.610 net/octeon_ep: not in enabled drivers build config
00:02:15.610 net/pcap: not in enabled drivers build config
00:02:15.610 net/pfe: not in enabled drivers build config
00:02:15.610 net/qede: not in enabled drivers build config
00:02:15.610 net/ring: not in enabled drivers build config
00:02:15.610 net/sfc: not in enabled drivers build config
00:02:15.610 net/softnic: not in enabled drivers build config
00:02:15.610 net/tap: not in enabled drivers build config
00:02:15.610 net/thunderx: not in enabled drivers build config
00:02:15.610 net/txgbe: not in enabled drivers build config
00:02:15.610 net/vdev_netvsc: not in enabled drivers build config
00:02:15.610 net/vhost: not in enabled drivers build config
00:02:15.610 net/virtio: not in enabled drivers build config
00:02:15.610 net/vmxnet3: not in enabled drivers build config
00:02:15.610 raw/*: missing internal dependency, "rawdev"
00:02:15.610 crypto/armv8: not in enabled drivers build config
00:02:15.610 crypto/bcmfs: not in enabled drivers build config
00:02:15.610 crypto/caam_jr: not in enabled drivers build config
00:02:15.610 crypto/ccp: not in enabled drivers build config
00:02:15.610 crypto/cnxk: not in enabled drivers build config
00:02:15.610 crypto/dpaa_sec: not in enabled drivers build config
00:02:15.610 crypto/dpaa2_sec: not in enabled drivers build config
00:02:15.610 crypto/mvsam: not in enabled drivers build config
00:02:15.610 crypto/nitrox: not in enabled drivers build config
00:02:15.610 crypto/null: not in enabled drivers build config
00:02:15.610 crypto/octeontx: not in enabled drivers build config
00:02:15.610 crypto/openssl: not in enabled drivers build config
00:02:15.610 crypto/scheduler: not in enabled drivers build config
00:02:15.610 crypto/uadk: not in enabled drivers build config
00:02:15.610 crypto/virtio: not in enabled drivers build config
00:02:15.610 compress/nitrox: not in enabled drivers build config
00:02:15.610 compress/octeontx: not in enabled drivers build config
00:02:15.610 compress/zlib: not in enabled drivers build config
00:02:15.610 regex/*: missing internal dependency, "regexdev"
00:02:15.610 ml/*: missing internal dependency, "mldev"
00:02:15.610 vdpa/ifc: not in enabled drivers build config
00:02:15.610 vdpa/mlx5: not in enabled drivers build config
00:02:15.610 vdpa/nfp: not in enabled drivers build config
00:02:15.610 vdpa/sfc: not in enabled drivers build config
00:02:15.610 event/*: missing internal dependency, "eventdev"
00:02:15.610 baseband/*: missing internal dependency, "bbdev"
00:02:15.610 gpu/*: missing internal dependency, "gpudev"
00:02:15.610
00:02:15.610
00:02:15.610 Build targets in project: 115
00:02:15.610
00:02:15.610 DPDK 24.03.0
00:02:15.610
00:02:15.610 User defined options
00:02:15.610 buildtype : debug
00:02:15.610 default_library : shared
00:02:15.610 libdir : lib
00:02:15.610 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:02:15.610 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror
00:02:15.610 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal
00:02:15.610 cpu_instruction_set: native
00:02:15.610 disable_apps : test-dma-perf,test,test-sad,test-acl,test-pmd,test-mldev,test-compress-perf,test-cmdline,test-regex,test-fib,graph,test-bbdev,dumpcap,test-gpudev,proc-info,test-pipeline,test-flow-perf,test-crypto-perf,pdump,test-eventdev,test-security-perf
00:02:15.610 disable_libs : port,lpm,ipsec,regexdev,dispatcher,argparse,bitratestats,rawdev,stack,graph,acl,bbdev,pipeline,member,sched,pcapng,mldev,eventdev,efd,metrics,latencystats,cfgfile,ip_frag,jobstats,pdump,pdcp,rib,node,fib,distributor,gso,table,bpf,gpudev,gro
00:02:15.610 enable_docs : false
00:02:15.610 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5
00:02:15.610 enable_kmods : false
00:02:15.610 tests : false
00:02:15.610
00:02:15.610 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:15.610 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp'
00:02:15.610 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:02:15.610 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:02:15.610 [3/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:02:15.610 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:02:15.610 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:02:15.610 [6/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:02:15.610 [7/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:02:15.610 [8/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:02:15.611 [9/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:02:15.611 [10/378] Compiling C object lib/librte_log.a.p/log_log.c.o
00:02:15.611 [11/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:02:15.611 [12/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:02:15.611 [13/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:02:15.611 [14/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:02:15.611 [15/378] Linking static target lib/librte_kvargs.a
00:02:15.611 [16/378] Linking static target lib/librte_log.a
00:02:15.611 [17/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:02:15.611 [18/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:02:15.611 [19/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:02:15.611 [20/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:02:15.611 [21/378] Linking static target lib/librte_pci.a
00:02:15.611 [22/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:02:15.611 [23/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:02:15.611 [24/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:02:15.611 [25/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:02:15.611 [26/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:02:15.870 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:02:15.870 [28/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:02:15.870 [29/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:02:15.870 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:02:15.870 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:02:15.870 [32/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:02:15.870 [33/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:02:15.870 [34/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:02:15.870 [35/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:02:15.870 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:02:15.870 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:02:15.870 [38/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:02:15.870 [39/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:02:15.870 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:02:15.870 [41/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:02:15.870 [42/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:02:15.870 [43/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:02:15.870 [44/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:02:15.870 [45/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:02:15.870 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:02:15.870 [47/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:02:15.870 [48/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:02:15.870 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:02:15.870 [50/378] Linking static target lib/net/libnet_crc_avx512_lib.a
00:02:15.870 [51/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:02:15.870 [52/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:02:15.870 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:02:15.870 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:02:15.870 [55/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o
00:02:15.870 [56/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:15.870 [57/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:02:15.870 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:02:15.870 [59/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:02:15.870 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:02:15.870 [61/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:02:15.870 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:02:15.870 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:02:15.870 [64/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:02:15.870 [65/378] Linking static target lib/librte_telemetry.a
00:02:15.870 [66/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:02:15.870 [67/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:02:15.870 [68/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:02:15.871 [69/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:02:15.871 [70/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:02:15.871 [71/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:02:15.871 [72/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:02:15.871 [73/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:02:15.871 [74/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:02:15.871 [75/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:02:15.871 [76/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:02:15.871 [77/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:02:15.871 [78/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:02:15.871 [79/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:02:15.871 [80/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:02:15.871 [81/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:02:15.871 [82/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:02:15.871 [83/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:02:15.871 [84/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:02:15.871 [85/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:02:16.139 [86/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:02:16.139 [87/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:02:16.139 [88/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:02:16.139 [89/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:02:16.139 [90/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:02:16.139 [91/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:02:16.139 [92/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:02:16.139 [93/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:02:16.139 [94/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:02:16.139 [95/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:02:16.139 [96/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:02:16.139 [97/378] Linking static target lib/librte_meter.a
00:02:16.139 [98/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:02:16.139 [99/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:02:16.139 [100/378] Linking static target lib/librte_rcu.a
00:02:16.139 [101/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:02:16.139 [102/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:02:16.139 [103/378] Linking static target lib/librte_ring.a
00:02:16.139 [104/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:02:16.139 [105/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:02:16.139 [106/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o
00:02:16.139 [107/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:02:16.139 [108/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:02:16.139 [109/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:02:16.139 [110/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:02:16.139 [111/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:02:16.139 [112/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o
00:02:16.139 [113/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:02:16.139 [114/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:02:16.139 [115/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:02:16.139 [116/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:02:16.139 [117/378] Linking static target lib/librte_cmdline.a
00:02:16.139 [118/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:02:16.139 [119/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:02:16.139 [120/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:02:16.139 [121/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:02:16.139 [122/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:02:16.139 [123/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:02:16.139 [124/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:02:16.139 [125/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:02:16.139 [126/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o
00:02:16.139 [127/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:02:16.139 [128/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:02:16.139 [129/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:02:16.139 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o
00:02:16.139 [131/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o
00:02:16.139 [132/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:02:16.139 [133/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:02:16.139 [134/378] Linking static target lib/librte_net.a
00:02:16.139 [135/378] Linking static target lib/librte_eal.a
00:02:16.139 [136/378] Linking static target lib/librte_mempool.a
00:02:16.404 [137/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:02:16.404 [138/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o
00:02:16.404 [139/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:02:16.404 [140/378] Linking static target lib/librte_mbuf.a
00:02:16.404 [141/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:02:16.404 [142/378] Linking target lib/librte_log.so.24.1
00:02:16.404 [143/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o
00:02:16.404 [144/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:16.404 [145/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:16.404 [146/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:16.404 [147/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:16.404 [148/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:16.404 [149/378] Linking static target lib/librte_timer.a 00:02:16.404 [150/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:16.404 [151/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:16.404 [152/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:16.405 [153/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:16.405 [154/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:16.405 [155/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:16.405 [156/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:16.405 [157/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:16.405 [158/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:16.405 [159/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:16.405 [160/378] Linking static target lib/librte_dmadev.a 00:02:16.405 [161/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:16.405 [162/378] Linking static target lib/librte_compressdev.a 00:02:16.664 [163/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.664 [164/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:16.664 [165/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:16.664 [166/378] 
Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:16.664 [167/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:16.664 [168/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:16.664 [169/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:16.664 [170/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:16.664 [171/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:16.664 [172/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.664 [173/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:16.664 [174/378] Linking static target lib/librte_reorder.a 00:02:16.664 [175/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.664 [176/378] Linking target lib/librte_kvargs.so.24.1 00:02:16.664 [177/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:16.664 [178/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.664 [179/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:16.664 [180/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:16.664 [181/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:16.664 [182/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:16.664 [183/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.664 [184/378] Linking static target lib/librte_security.a 00:02:16.664 [185/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:16.664 [186/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:16.664 [187/378] Linking target 
lib/librte_telemetry.so.24.1 00:02:16.664 [188/378] Linking static target lib/librte_power.a 00:02:16.664 [189/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:16.664 [190/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:16.664 [191/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:16.664 [192/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:16.664 [193/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:16.664 [194/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:16.664 [195/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:16.664 [196/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:16.664 [197/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:16.664 [198/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:16.923 [199/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:16.923 [200/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:16.923 [201/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:16.923 [202/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:16.923 [203/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:16.923 [204/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:16.923 [205/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.923 [206/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:16.923 [207/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:16.923 [208/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:16.923 [209/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:16.923 [210/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:16.923 [211/378] Linking static target lib/librte_hash.a 00:02:16.923 [212/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:16.923 [213/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:16.923 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:16.923 [215/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:16.923 [216/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:16.923 [217/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:16.923 [218/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:16.923 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:16.923 [220/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:16.924 [221/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:16.924 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:16.924 [223/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:16.924 [224/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:16.924 [225/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:16.924 [226/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 
00:02:16.924 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:16.924 [228/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:16.924 [229/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:16.924 [230/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:16.924 [231/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:16.924 [232/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:16.924 [233/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:16.924 [234/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:16.924 [235/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:16.924 [236/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.924 [237/378] Linking static target drivers/librte_bus_vdev.a 00:02:16.924 [238/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:16.924 [239/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.924 [240/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:16.924 [241/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:16.924 [242/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.924 [243/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.924 [244/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:16.924 [245/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:16.924 [246/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:16.924 [247/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:16.924 [248/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:17.181 [249/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:17.181 [250/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:17.181 [251/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:17.181 [252/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:17.181 [253/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:17.181 [254/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:17.181 [255/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:17.181 [256/378] Linking static target drivers/librte_bus_pci.a 00:02:17.181 [257/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:17.181 [258/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:17.181 [259/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:17.181 [260/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.181 [261/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:17.181 [262/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:17.181 [263/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:17.181 [264/378] Compiling C object 
drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:17.181 [265/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:17.181 [266/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:17.181 [267/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:17.181 [268/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:17.181 [269/378] Linking static target lib/librte_cryptodev.a 00:02:17.181 [270/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:17.181 [271/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:17.181 [272/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.181 [273/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:17.181 [274/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:17.181 [275/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:17.181 [276/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.181 [277/378] Linking static target drivers/librte_mempool_ring.a 00:02:17.181 [278/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:17.181 [279/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:17.181 [280/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:17.181 [281/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:17.181 [282/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:17.438 [283/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 
00:02:17.438 [284/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:17.438 [285/378] Linking static target lib/librte_ethdev.a 00:02:17.438 [286/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:17.438 [287/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:17.438 [288/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.438 [289/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:17.438 [290/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:17.438 [291/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:17.438 [292/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:17.438 [293/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:17.438 [294/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:17.438 [295/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:17.438 [296/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:17.438 [297/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:17.438 [298/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:17.438 [299/378] Linking static target drivers/librte_compress_mlx5.a 00:02:17.438 [300/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:17.438 [301/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:17.438 [302/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:17.438 [303/378] 
Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.438 [304/378] Linking static target drivers/librte_compress_isal.a 00:02:17.697 [305/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:17.697 [306/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:17.697 [307/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:17.697 [308/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:17.697 [309/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:17.697 [310/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:17.697 [311/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.697 [312/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:17.697 [313/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.955 [314/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:17.955 [315/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:17.955 [316/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:17.955 [317/378] Linking static target drivers/librte_common_mlx5.a 00:02:17.955 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:18.212 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:18.518 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:18.518 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:18.518 [322/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:18.518 
[323/378] Linking static target drivers/librte_common_qat.a 00:02:19.084 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:19.084 [325/378] Linking static target lib/librte_vhost.a 00:02:19.084 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:20.987 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.895 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.434 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.372 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.372 [331/378] Linking target lib/librte_eal.so.24.1 00:02:26.631 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:26.631 [333/378] Linking target lib/librte_ring.so.24.1 00:02:26.631 [334/378] Linking target lib/librte_meter.so.24.1 00:02:26.631 [335/378] Linking target lib/librte_pci.so.24.1 00:02:26.631 [336/378] Linking target lib/librte_timer.so.24.1 00:02:26.631 [337/378] Linking target lib/librte_dmadev.so.24.1 00:02:26.631 [338/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:26.631 [339/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:26.890 [340/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:26.890 [341/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:26.890 [342/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:26.890 [343/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:26.890 [344/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:26.890 [345/378] Generating symbol file 
lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:26.890 [346/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:26.890 [347/378] Linking target lib/librte_rcu.so.24.1 00:02:26.890 [348/378] Linking target lib/librte_mempool.so.24.1 00:02:26.890 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:26.890 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:26.890 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:26.891 [352/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:26.891 [353/378] Linking target lib/librte_mbuf.so.24.1 00:02:26.891 [354/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:27.150 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:27.150 [356/378] Linking target lib/librte_reorder.so.24.1 00:02:27.150 [357/378] Linking target lib/librte_compressdev.so.24.1 00:02:27.150 [358/378] Linking target lib/librte_net.so.24.1 00:02:27.150 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:02:27.410 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:27.410 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:27.410 [362/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:27.410 [363/378] Linking target lib/librte_cmdline.so.24.1 00:02:27.410 [364/378] Linking target lib/librte_hash.so.24.1 00:02:27.410 [365/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:27.410 [366/378] Linking target lib/librte_ethdev.so.24.1 00:02:27.410 [367/378] Linking target lib/librte_security.so.24.1 00:02:27.410 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:27.410 [369/378] Generating symbol file 
lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:27.410 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:27.668 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:27.668 [372/378] Linking target lib/librte_power.so.24.1 00:02:27.668 [373/378] Linking target lib/librte_vhost.so.24.1 00:02:27.668 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:27.668 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:27.668 [376/378] Linking target drivers/librte_common_qat.so.24.1 00:02:27.668 [377/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:27.668 [378/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:27.668 INFO: autodetecting backend as ninja 00:02:27.668 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 96 00:02:29.048 CC lib/log/log.o 00:02:29.048 CC lib/log/log_flags.o 00:02:29.048 CC lib/log/log_deprecated.o 00:02:29.048 CC lib/ut/ut.o 00:02:29.048 CC lib/ut_mock/mock.o 00:02:29.048 LIB libspdk_log.a 00:02:29.048 LIB libspdk_ut.a 00:02:29.048 LIB libspdk_ut_mock.a 00:02:29.048 SO libspdk_log.so.7.0 00:02:29.048 SO libspdk_ut.so.2.0 00:02:29.048 SO libspdk_ut_mock.so.6.0 00:02:29.048 SYMLINK libspdk_ut_mock.so 00:02:29.048 SYMLINK libspdk_log.so 00:02:29.048 SYMLINK libspdk_ut.so 00:02:29.307 CC lib/dma/dma.o 00:02:29.307 CC lib/ioat/ioat.o 00:02:29.307 CXX lib/trace_parser/trace.o 00:02:29.307 CC lib/util/base64.o 00:02:29.307 CC lib/util/bit_array.o 00:02:29.307 CC lib/util/cpuset.o 00:02:29.307 CC lib/util/crc16.o 00:02:29.307 CC lib/util/crc32.o 00:02:29.307 CC lib/util/crc32c.o 00:02:29.307 CC lib/util/crc32_ieee.o 00:02:29.307 CC lib/util/crc64.o 00:02:29.307 CC lib/util/dif.o 00:02:29.307 CC lib/util/fd.o 00:02:29.307 CC lib/util/file.o 00:02:29.307 CC lib/util/hexlify.o 00:02:29.307 CC 
lib/util/iov.o 00:02:29.307 CC lib/util/math.o 00:02:29.307 CC lib/util/pipe.o 00:02:29.307 CC lib/util/strerror_tls.o 00:02:29.307 CC lib/util/string.o 00:02:29.307 CC lib/util/uuid.o 00:02:29.307 CC lib/util/fd_group.o 00:02:29.307 CC lib/util/xor.o 00:02:29.307 CC lib/util/zipf.o 00:02:29.567 CC lib/vfio_user/host/vfio_user.o 00:02:29.567 CC lib/vfio_user/host/vfio_user_pci.o 00:02:29.567 LIB libspdk_dma.a 00:02:29.567 SO libspdk_dma.so.4.0 00:02:29.567 SYMLINK libspdk_dma.so 00:02:29.567 LIB libspdk_ioat.a 00:02:29.567 SO libspdk_ioat.so.7.0 00:02:29.567 SYMLINK libspdk_ioat.so 00:02:29.825 LIB libspdk_vfio_user.a 00:02:29.825 SO libspdk_vfio_user.so.5.0 00:02:29.826 LIB libspdk_util.a 00:02:29.826 SYMLINK libspdk_vfio_user.so 00:02:29.826 SO libspdk_util.so.9.1 00:02:29.826 SYMLINK libspdk_util.so 00:02:30.084 LIB libspdk_trace_parser.a 00:02:30.084 SO libspdk_trace_parser.so.5.0 00:02:30.084 SYMLINK libspdk_trace_parser.so 00:02:30.342 CC lib/idxd/idxd.o 00:02:30.342 CC lib/idxd/idxd_user.o 00:02:30.342 CC lib/idxd/idxd_kernel.o 00:02:30.342 CC lib/json/json_parse.o 00:02:30.342 CC lib/json/json_write.o 00:02:30.342 CC lib/json/json_util.o 00:02:30.342 CC lib/rdma_utils/rdma_utils.o 00:02:30.342 CC lib/env_dpdk/env.o 00:02:30.342 CC lib/env_dpdk/memory.o 00:02:30.342 CC lib/conf/conf.o 00:02:30.342 CC lib/env_dpdk/pci.o 00:02:30.342 CC lib/rdma_provider/common.o 00:02:30.342 CC lib/env_dpdk/init.o 00:02:30.342 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:30.342 CC lib/env_dpdk/threads.o 00:02:30.342 CC lib/env_dpdk/pci_ioat.o 00:02:30.342 CC lib/reduce/reduce.o 00:02:30.342 CC lib/vmd/vmd.o 00:02:30.342 CC lib/env_dpdk/pci_virtio.o 00:02:30.342 CC lib/vmd/led.o 00:02:30.342 CC lib/env_dpdk/pci_vmd.o 00:02:30.342 CC lib/env_dpdk/pci_idxd.o 00:02:30.342 CC lib/env_dpdk/pci_event.o 00:02:30.342 CC lib/env_dpdk/sigbus_handler.o 00:02:30.342 CC lib/env_dpdk/pci_dpdk.o 00:02:30.342 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:30.342 CC lib/env_dpdk/pci_dpdk_2211.o 
00:02:30.342 LIB libspdk_rdma_provider.a 00:02:30.601 LIB libspdk_conf.a 00:02:30.601 LIB libspdk_rdma_utils.a 00:02:30.601 SO libspdk_rdma_provider.so.6.0 00:02:30.601 SO libspdk_conf.so.6.0 00:02:30.601 LIB libspdk_json.a 00:02:30.601 SO libspdk_rdma_utils.so.1.0 00:02:30.601 SO libspdk_json.so.6.0 00:02:30.601 SYMLINK libspdk_rdma_provider.so 00:02:30.601 SYMLINK libspdk_conf.so 00:02:30.601 SYMLINK libspdk_rdma_utils.so 00:02:30.601 SYMLINK libspdk_json.so 00:02:30.601 LIB libspdk_idxd.a 00:02:30.601 SO libspdk_idxd.so.12.0 00:02:30.859 LIB libspdk_vmd.a 00:02:30.859 SYMLINK libspdk_idxd.so 00:02:30.859 LIB libspdk_reduce.a 00:02:30.859 SO libspdk_vmd.so.6.0 00:02:30.859 SO libspdk_reduce.so.6.0 00:02:30.859 SYMLINK libspdk_vmd.so 00:02:30.859 SYMLINK libspdk_reduce.so 00:02:30.859 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:30.859 CC lib/jsonrpc/jsonrpc_server.o 00:02:30.859 CC lib/jsonrpc/jsonrpc_client.o 00:02:30.859 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:31.118 LIB libspdk_jsonrpc.a 00:02:31.118 SO libspdk_jsonrpc.so.6.0 00:02:31.118 SYMLINK libspdk_jsonrpc.so 00:02:31.377 LIB libspdk_env_dpdk.a 00:02:31.377 SO libspdk_env_dpdk.so.14.1 00:02:31.377 SYMLINK libspdk_env_dpdk.so 00:02:31.635 CC lib/rpc/rpc.o 00:02:31.636 LIB libspdk_rpc.a 00:02:31.636 SO libspdk_rpc.so.6.0 00:02:31.893 SYMLINK libspdk_rpc.so 00:02:32.151 CC lib/trace/trace.o 00:02:32.151 CC lib/keyring/keyring.o 00:02:32.151 CC lib/trace/trace_flags.o 00:02:32.151 CC lib/keyring/keyring_rpc.o 00:02:32.151 CC lib/trace/trace_rpc.o 00:02:32.151 CC lib/notify/notify.o 00:02:32.151 CC lib/notify/notify_rpc.o 00:02:32.151 LIB libspdk_notify.a 00:02:32.409 SO libspdk_notify.so.6.0 00:02:32.409 LIB libspdk_keyring.a 00:02:32.409 LIB libspdk_trace.a 00:02:32.409 SO libspdk_trace.so.10.0 00:02:32.409 SYMLINK libspdk_notify.so 00:02:32.409 SO libspdk_keyring.so.1.0 00:02:32.409 SYMLINK libspdk_trace.so 00:02:32.409 SYMLINK libspdk_keyring.so 00:02:32.668 CC lib/thread/thread.o 00:02:32.668 CC 
lib/thread/iobuf.o 00:02:32.668 CC lib/sock/sock.o 00:02:32.668 CC lib/sock/sock_rpc.o 00:02:32.927 LIB libspdk_sock.a 00:02:33.185 SO libspdk_sock.so.10.0 00:02:33.185 SYMLINK libspdk_sock.so 00:02:33.444 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:33.444 CC lib/nvme/nvme_ctrlr.o 00:02:33.444 CC lib/nvme/nvme_fabric.o 00:02:33.444 CC lib/nvme/nvme_ns_cmd.o 00:02:33.444 CC lib/nvme/nvme_ns.o 00:02:33.444 CC lib/nvme/nvme_pcie_common.o 00:02:33.444 CC lib/nvme/nvme_pcie.o 00:02:33.444 CC lib/nvme/nvme_qpair.o 00:02:33.444 CC lib/nvme/nvme.o 00:02:33.444 CC lib/nvme/nvme_quirks.o 00:02:33.444 CC lib/nvme/nvme_transport.o 00:02:33.444 CC lib/nvme/nvme_discovery.o 00:02:33.444 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:33.444 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:33.444 CC lib/nvme/nvme_tcp.o 00:02:33.444 CC lib/nvme/nvme_opal.o 00:02:33.444 CC lib/nvme/nvme_io_msg.o 00:02:33.444 CC lib/nvme/nvme_poll_group.o 00:02:33.444 CC lib/nvme/nvme_zns.o 00:02:33.444 CC lib/nvme/nvme_stubs.o 00:02:33.444 CC lib/nvme/nvme_auth.o 00:02:33.444 CC lib/nvme/nvme_cuse.o 00:02:33.444 CC lib/nvme/nvme_rdma.o 00:02:33.702 LIB libspdk_thread.a 00:02:33.702 SO libspdk_thread.so.10.1 00:02:33.961 SYMLINK libspdk_thread.so 00:02:34.221 CC lib/init/json_config.o 00:02:34.221 CC lib/init/subsystem.o 00:02:34.221 CC lib/blob/blobstore.o 00:02:34.221 CC lib/init/subsystem_rpc.o 00:02:34.221 CC lib/blob/request.o 00:02:34.221 CC lib/init/rpc.o 00:02:34.221 CC lib/blob/zeroes.o 00:02:34.221 CC lib/blob/blob_bs_dev.o 00:02:34.221 CC lib/accel/accel.o 00:02:34.221 CC lib/accel/accel_rpc.o 00:02:34.221 CC lib/accel/accel_sw.o 00:02:34.221 CC lib/virtio/virtio.o 00:02:34.221 CC lib/virtio/virtio_vfio_user.o 00:02:34.221 CC lib/virtio/virtio_vhost_user.o 00:02:34.221 CC lib/virtio/virtio_pci.o 00:02:34.480 LIB libspdk_init.a 00:02:34.480 SO libspdk_init.so.5.0 00:02:34.480 LIB libspdk_virtio.a 00:02:34.480 SO libspdk_virtio.so.7.0 00:02:34.480 SYMLINK libspdk_init.so 00:02:34.480 SYMLINK libspdk_virtio.so 
00:02:34.739 CC lib/event/app.o
00:02:34.739 CC lib/event/reactor.o
00:02:34.739 CC lib/event/log_rpc.o
00:02:34.739 CC lib/event/app_rpc.o
00:02:34.739 CC lib/event/scheduler_static.o
00:02:34.998 LIB libspdk_accel.a
00:02:34.998 SO libspdk_accel.so.15.1
00:02:34.998 LIB libspdk_nvme.a
00:02:34.998 SYMLINK libspdk_accel.so
00:02:34.998 LIB libspdk_event.a
00:02:34.998 SO libspdk_nvme.so.13.1
00:02:35.257 SO libspdk_event.so.14.0
00:02:35.257 SYMLINK libspdk_event.so
00:02:35.257 CC lib/bdev/bdev.o
00:02:35.257 CC lib/bdev/bdev_rpc.o
00:02:35.257 CC lib/bdev/bdev_zone.o
00:02:35.257 CC lib/bdev/part.o
00:02:35.257 CC lib/bdev/scsi_nvme.o
00:02:35.257 SYMLINK libspdk_nvme.so
00:02:36.198 LIB libspdk_blob.a
00:02:36.198 SO libspdk_blob.so.11.0
00:02:36.198 SYMLINK libspdk_blob.so
00:02:36.767 CC lib/blobfs/blobfs.o
00:02:36.767 CC lib/blobfs/tree.o
00:02:36.767 CC lib/lvol/lvol.o
00:02:37.026 LIB libspdk_bdev.a
00:02:37.026 SO libspdk_bdev.so.15.1
00:02:37.285 LIB libspdk_blobfs.a
00:02:37.285 SYMLINK libspdk_bdev.so
00:02:37.285 SO libspdk_blobfs.so.10.0
00:02:37.285 LIB libspdk_lvol.a
00:02:37.285 SO libspdk_lvol.so.10.0
00:02:37.285 SYMLINK libspdk_blobfs.so
00:02:37.285 SYMLINK libspdk_lvol.so
00:02:37.546 CC lib/scsi/dev.o
00:02:37.546 CC lib/scsi/lun.o
00:02:37.546 CC lib/ftl/ftl_core.o
00:02:37.546 CC lib/scsi/port.o
00:02:37.546 CC lib/ftl/ftl_init.o
00:02:37.546 CC lib/scsi/scsi.o
00:02:37.546 CC lib/ftl/ftl_layout.o
00:02:37.546 CC lib/scsi/scsi_bdev.o
00:02:37.546 CC lib/nvmf/ctrlr.o
00:02:37.546 CC lib/ublk/ublk.o
00:02:37.546 CC lib/ftl/ftl_debug.o
00:02:37.546 CC lib/ublk/ublk_rpc.o
00:02:37.546 CC lib/nvmf/ctrlr_discovery.o
00:02:37.546 CC lib/ftl/ftl_io.o
00:02:37.546 CC lib/scsi/scsi_rpc.o
00:02:37.546 CC lib/nbd/nbd_rpc.o
00:02:37.546 CC lib/ftl/ftl_sb.o
00:02:37.546 CC lib/nvmf/ctrlr_bdev.o
00:02:37.546 CC lib/nbd/nbd.o
00:02:37.546 CC lib/scsi/scsi_pr.o
00:02:37.546 CC lib/scsi/task.o
00:02:37.546 CC lib/ftl/ftl_l2p.o
00:02:37.546 CC lib/nvmf/nvmf.o
00:02:37.546 CC lib/ftl/ftl_l2p_flat.o
00:02:37.546 CC lib/nvmf/subsystem.o
00:02:37.546 CC lib/ftl/ftl_nv_cache.o
00:02:37.546 CC lib/nvmf/nvmf_rpc.o
00:02:37.546 CC lib/nvmf/transport.o
00:02:37.546 CC lib/ftl/ftl_band.o
00:02:37.546 CC lib/ftl/ftl_band_ops.o
00:02:37.546 CC lib/nvmf/tcp.o
00:02:37.546 CC lib/nvmf/mdns_server.o
00:02:37.546 CC lib/nvmf/stubs.o
00:02:37.546 CC lib/ftl/ftl_writer.o
00:02:37.546 CC lib/ftl/ftl_rq.o
00:02:37.546 CC lib/nvmf/rdma.o
00:02:37.546 CC lib/ftl/ftl_reloc.o
00:02:37.546 CC lib/nvmf/auth.o
00:02:37.546 CC lib/ftl/ftl_l2p_cache.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt.o
00:02:37.546 CC lib/ftl/ftl_p2l.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_bdev.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_shutdown.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_md.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_startup.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_misc.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_ioch.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_band.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_l2p.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_self_test.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_p2l.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_upgrade.o
00:02:37.546 CC lib/ftl/mngt/ftl_mngt_recovery.o
00:02:37.546 CC lib/ftl/utils/ftl_conf.o
00:02:37.546 CC lib/ftl/utils/ftl_md.o
00:02:37.546 CC lib/ftl/utils/ftl_mempool.o
00:02:37.546 CC lib/ftl/utils/ftl_bitmap.o
00:02:37.546 CC lib/ftl/utils/ftl_property.o
00:02:37.546 CC lib/ftl/utils/ftl_layout_tracker_bdev.o
00:02:37.546 CC lib/ftl/upgrade/ftl_layout_upgrade.o
00:02:37.546 CC lib/ftl/upgrade/ftl_sb_upgrade.o
00:02:37.546 CC lib/ftl/upgrade/ftl_p2l_upgrade.o
00:02:37.546 CC lib/ftl/upgrade/ftl_band_upgrade.o
00:02:37.546 CC lib/ftl/upgrade/ftl_chunk_upgrade.o
00:02:37.546 CC lib/ftl/upgrade/ftl_trim_upgrade.o
00:02:37.546 CC lib/ftl/upgrade/ftl_sb_v3.o
00:02:37.546 CC lib/ftl/upgrade/ftl_sb_v5.o
00:02:37.546 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o
00:02:37.546 CC lib/ftl/nvc/ftl_nvc_dev.o
00:02:37.546 CC lib/ftl/base/ftl_base_bdev.o
00:02:37.546 CC lib/ftl/base/ftl_base_dev.o
00:02:37.546 CC lib/ftl/ftl_trace.o
00:02:38.114 LIB libspdk_scsi.a
00:02:38.114 LIB libspdk_nbd.a
00:02:38.114 SO libspdk_scsi.so.9.0
00:02:38.114 SO libspdk_nbd.so.7.0
00:02:38.114 SYMLINK libspdk_scsi.so
00:02:38.114 SYMLINK libspdk_nbd.so
00:02:38.114 LIB libspdk_ublk.a
00:02:38.372 SO libspdk_ublk.so.3.0
00:02:38.372 SYMLINK libspdk_ublk.so
00:02:38.372 CC lib/vhost/vhost.o
00:02:38.372 CC lib/vhost/vhost_rpc.o
00:02:38.372 CC lib/vhost/vhost_scsi.o
00:02:38.372 CC lib/vhost/vhost_blk.o
00:02:38.372 CC lib/vhost/rte_vhost_user.o
00:02:38.372 CC lib/iscsi/conn.o
00:02:38.372 CC lib/iscsi/init_grp.o
00:02:38.372 CC lib/iscsi/iscsi.o
00:02:38.372 CC lib/iscsi/md5.o
00:02:38.372 CC lib/iscsi/param.o
00:02:38.372 CC lib/iscsi/portal_grp.o
00:02:38.372 CC lib/iscsi/tgt_node.o
00:02:38.372 CC lib/iscsi/iscsi_subsystem.o
00:02:38.372 CC lib/iscsi/iscsi_rpc.o
00:02:38.372 CC lib/iscsi/task.o
00:02:38.631 LIB libspdk_ftl.a
00:02:38.631 SO libspdk_ftl.so.9.0
00:02:38.890 SYMLINK libspdk_ftl.so
00:02:39.148 LIB libspdk_nvmf.a
00:02:39.148 SO libspdk_nvmf.so.18.1
00:02:39.148 LIB libspdk_vhost.a
00:02:39.407 SO libspdk_vhost.so.8.0
00:02:39.407 SYMLINK libspdk_vhost.so
00:02:39.407 SYMLINK libspdk_nvmf.so
00:02:39.407 LIB libspdk_iscsi.a
00:02:39.407 SO libspdk_iscsi.so.8.0
00:02:39.667 SYMLINK libspdk_iscsi.so
00:02:40.235 CC module/env_dpdk/env_dpdk_rpc.o
00:02:40.235 LIB libspdk_env_dpdk_rpc.a
00:02:40.235 CC module/keyring/linux/keyring.o
00:02:40.235 CC module/accel/iaa/accel_iaa_rpc.o
00:02:40.235 CC module/accel/iaa/accel_iaa.o
00:02:40.235 CC module/blob/bdev/blob_bdev.o
00:02:40.235 CC module/keyring/linux/keyring_rpc.o
00:02:40.235 CC module/scheduler/dpdk_governor/dpdk_governor.o
00:02:40.235 CC module/accel/ioat/accel_ioat.o
00:02:40.235 CC module/accel/error/accel_error.o
00:02:40.235 CC module/accel/error/accel_error_rpc.o
00:02:40.235 CC module/accel/ioat/accel_ioat_rpc.o
00:02:40.235 CC module/scheduler/gscheduler/gscheduler.o
00:02:40.235 CC module/sock/posix/posix.o
00:02:40.235 CC module/scheduler/dynamic/scheduler_dynamic.o
00:02:40.235 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o
00:02:40.235 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o
00:02:40.235 CC module/keyring/file/keyring.o
00:02:40.235 CC module/accel/dsa/accel_dsa.o
00:02:40.235 CC module/keyring/file/keyring_rpc.o
00:02:40.235 CC module/accel/dsa/accel_dsa_rpc.o
00:02:40.235 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o
00:02:40.235 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o
00:02:40.235 SO libspdk_env_dpdk_rpc.so.6.0
00:02:40.235 SYMLINK libspdk_env_dpdk_rpc.so
00:02:40.494 LIB libspdk_scheduler_dpdk_governor.a
00:02:40.494 LIB libspdk_keyring_linux.a
00:02:40.494 LIB libspdk_scheduler_gscheduler.a
00:02:40.494 LIB libspdk_keyring_file.a
00:02:40.494 LIB libspdk_accel_ioat.a
00:02:40.494 SO libspdk_keyring_linux.so.1.0
00:02:40.494 LIB libspdk_scheduler_dynamic.a
00:02:40.494 SO libspdk_scheduler_dpdk_governor.so.4.0
00:02:40.494 LIB libspdk_accel_error.a
00:02:40.494 SO libspdk_keyring_file.so.1.0
00:02:40.494 LIB libspdk_accel_iaa.a
00:02:40.494 SO libspdk_scheduler_gscheduler.so.4.0
00:02:40.494 SO libspdk_accel_ioat.so.6.0
00:02:40.494 SO libspdk_scheduler_dynamic.so.4.0
00:02:40.494 LIB libspdk_blob_bdev.a
00:02:40.494 SO libspdk_accel_error.so.2.0
00:02:40.494 SO libspdk_accel_iaa.so.3.0
00:02:40.494 LIB libspdk_accel_dsa.a
00:02:40.494 SYMLINK libspdk_keyring_linux.so
00:02:40.494 SYMLINK libspdk_scheduler_dpdk_governor.so
00:02:40.494 SYMLINK libspdk_keyring_file.so
00:02:40.494 SYMLINK libspdk_scheduler_gscheduler.so
00:02:40.494 SO libspdk_blob_bdev.so.11.0
00:02:40.494 SO libspdk_accel_dsa.so.5.0
00:02:40.494 SYMLINK libspdk_accel_ioat.so
00:02:40.494 SYMLINK libspdk_scheduler_dynamic.so
00:02:40.494 SYMLINK libspdk_accel_error.so
00:02:40.494 SYMLINK libspdk_accel_iaa.so
00:02:40.494 SYMLINK libspdk_accel_dsa.so
00:02:40.494 SYMLINK libspdk_blob_bdev.so
00:02:40.752 LIB libspdk_sock_posix.a
00:02:41.012 SO libspdk_sock_posix.so.6.0
00:02:41.012 SYMLINK libspdk_sock_posix.so
00:02:41.012 CC module/blobfs/bdev/blobfs_bdev.o
00:02:41.012 CC module/blobfs/bdev/blobfs_bdev_rpc.o
00:02:41.012 CC module/bdev/zone_block/vbdev_zone_block.o
00:02:41.012 CC module/bdev/zone_block/vbdev_zone_block_rpc.o
00:02:41.012 CC module/bdev/gpt/vbdev_gpt.o
00:02:41.012 CC module/bdev/gpt/gpt.o
00:02:41.012 CC module/bdev/error/vbdev_error_rpc.o
00:02:41.012 CC module/bdev/lvol/vbdev_lvol.o
00:02:41.012 CC module/bdev/error/vbdev_error.o
00:02:41.012 CC module/bdev/malloc/bdev_malloc.o
00:02:41.012 CC module/bdev/nvme/bdev_nvme_rpc.o
00:02:41.012 CC module/bdev/lvol/vbdev_lvol_rpc.o
00:02:41.012 CC module/bdev/nvme/bdev_nvme.o
00:02:41.012 CC module/bdev/malloc/bdev_malloc_rpc.o
00:02:41.012 CC module/bdev/nvme/nvme_rpc.o
00:02:41.012 CC module/bdev/nvme/bdev_mdns_client.o
00:02:41.012 CC module/bdev/nvme/vbdev_opal.o
00:02:41.012 CC module/bdev/nvme/vbdev_opal_rpc.o
00:02:41.012 CC module/bdev/delay/vbdev_delay.o
00:02:41.012 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o
00:02:41.012 CC module/bdev/delay/vbdev_delay_rpc.o
00:02:41.012 CC module/bdev/passthru/vbdev_passthru.o
00:02:41.012 CC module/bdev/passthru/vbdev_passthru_rpc.o
00:02:41.012 CC module/bdev/null/bdev_null.o
00:02:41.012 CC module/bdev/null/bdev_null_rpc.o
00:02:41.012 CC module/bdev/aio/bdev_aio.o
00:02:41.012 CC module/bdev/aio/bdev_aio_rpc.o
00:02:41.012 CC module/bdev/compress/vbdev_compress.o
00:02:41.012 CC module/bdev/iscsi/bdev_iscsi.o
00:02:41.012 CC module/bdev/compress/vbdev_compress_rpc.o
00:02:41.012 CC module/bdev/crypto/vbdev_crypto.o
00:02:41.012 CC module/bdev/raid/bdev_raid.o
00:02:41.012 CC module/bdev/split/vbdev_split.o
00:02:41.012 CC module/bdev/ftl/bdev_ftl.o
00:02:41.012 CC module/bdev/iscsi/bdev_iscsi_rpc.o
00:02:41.012 CC module/bdev/split/vbdev_split_rpc.o
00:02:41.012 CC module/bdev/ftl/bdev_ftl_rpc.o
00:02:41.012 CC module/bdev/raid/bdev_raid_rpc.o
00:02:41.012 CC module/bdev/raid/bdev_raid_sb.o
00:02:41.012 CC module/bdev/crypto/vbdev_crypto_rpc.o
00:02:41.012 CC module/bdev/raid/raid0.o
00:02:41.012 CC module/bdev/raid/raid1.o
00:02:41.012 CC module/bdev/raid/concat.o
00:02:41.012 CC module/bdev/virtio/bdev_virtio_scsi.o
00:02:41.012 CC module/bdev/virtio/bdev_virtio_rpc.o
00:02:41.012 CC module/bdev/virtio/bdev_virtio_blk.o
00:02:41.012 LIB libspdk_accel_dpdk_compressdev.a
00:02:41.012 SO libspdk_accel_dpdk_compressdev.so.3.0
00:02:41.271 SYMLINK libspdk_accel_dpdk_compressdev.so
00:02:41.271 LIB libspdk_blobfs_bdev.a
00:02:41.271 SO libspdk_blobfs_bdev.so.6.0
00:02:41.271 SYMLINK libspdk_blobfs_bdev.so
00:02:41.271 LIB libspdk_bdev_gpt.a
00:02:41.271 LIB libspdk_bdev_null.a
00:02:41.271 LIB libspdk_accel_dpdk_cryptodev.a
00:02:41.271 LIB libspdk_bdev_split.a
00:02:41.271 SO libspdk_bdev_gpt.so.6.0
00:02:41.271 LIB libspdk_bdev_error.a
00:02:41.271 SO libspdk_bdev_null.so.6.0
00:02:41.271 SO libspdk_bdev_split.so.6.0
00:02:41.529 SO libspdk_accel_dpdk_cryptodev.so.3.0
00:02:41.529 LIB libspdk_bdev_zone_block.a
00:02:41.529 LIB libspdk_bdev_aio.a
00:02:41.529 LIB libspdk_bdev_passthru.a
00:02:41.529 LIB libspdk_bdev_ftl.a
00:02:41.529 SO libspdk_bdev_error.so.6.0
00:02:41.529 SO libspdk_bdev_zone_block.so.6.0
00:02:41.529 SYMLINK libspdk_bdev_split.so
00:02:41.529 LIB libspdk_bdev_iscsi.a
00:02:41.529 LIB libspdk_bdev_delay.a
00:02:41.529 SYMLINK libspdk_bdev_gpt.so
00:02:41.529 SO libspdk_bdev_aio.so.6.0
00:02:41.529 SYMLINK libspdk_bdev_null.so
00:02:41.529 SO libspdk_bdev_ftl.so.6.0
00:02:41.529 LIB libspdk_bdev_malloc.a
00:02:41.529 SO libspdk_bdev_passthru.so.6.0
00:02:41.529 SYMLINK libspdk_accel_dpdk_cryptodev.so
00:02:41.529 LIB libspdk_bdev_crypto.a
00:02:41.529 SO libspdk_bdev_delay.so.6.0
00:02:41.529 SO libspdk_bdev_iscsi.so.6.0
00:02:41.529 SYMLINK libspdk_bdev_error.so
00:02:41.529 SO libspdk_bdev_malloc.so.6.0
00:02:41.529 SYMLINK libspdk_bdev_zone_block.so
00:02:41.529 SO libspdk_bdev_crypto.so.6.0
00:02:41.529 SYMLINK libspdk_bdev_aio.so
00:02:41.529 SYMLINK libspdk_bdev_ftl.so
00:02:41.529 SYMLINK libspdk_bdev_passthru.so
00:02:41.529 LIB libspdk_bdev_compress.a
00:02:41.529 SYMLINK libspdk_bdev_delay.so
00:02:41.529 LIB libspdk_bdev_lvol.a
00:02:41.529 SYMLINK libspdk_bdev_iscsi.so
00:02:41.529 SYMLINK libspdk_bdev_malloc.so
00:02:41.529 LIB libspdk_bdev_virtio.a
00:02:41.529 SYMLINK libspdk_bdev_crypto.so
00:02:41.529 SO libspdk_bdev_compress.so.6.0
00:02:41.529 SO libspdk_bdev_lvol.so.6.0
00:02:41.529 SO libspdk_bdev_virtio.so.6.0
00:02:41.529 SYMLINK libspdk_bdev_compress.so
00:02:41.529 SYMLINK libspdk_bdev_lvol.so
00:02:41.788 SYMLINK libspdk_bdev_virtio.so
00:02:41.788 LIB libspdk_bdev_raid.a
00:02:42.047 SO libspdk_bdev_raid.so.6.0
00:02:42.047 SYMLINK libspdk_bdev_raid.so
00:02:42.616 LIB libspdk_bdev_nvme.a
00:02:42.878 SO libspdk_bdev_nvme.so.7.0
00:02:42.878 SYMLINK libspdk_bdev_nvme.so
00:02:43.445 CC module/event/subsystems/iobuf/iobuf.o
00:02:43.445 CC module/event/subsystems/iobuf/iobuf_rpc.o
00:02:43.445 CC module/event/subsystems/vmd/vmd.o
00:02:43.445 CC module/event/subsystems/vmd/vmd_rpc.o
00:02:43.445 CC module/event/subsystems/sock/sock.o
00:02:43.445 CC module/event/subsystems/keyring/keyring.o
00:02:43.445 CC module/event/subsystems/vhost_blk/vhost_blk.o
00:02:43.445 CC module/event/subsystems/scheduler/scheduler.o
00:02:43.705 LIB libspdk_event_sock.a
00:02:43.705 LIB libspdk_event_vhost_blk.a
00:02:43.705 LIB libspdk_event_scheduler.a
00:02:43.705 LIB libspdk_event_keyring.a
00:02:43.705 LIB libspdk_event_vmd.a
00:02:43.705 LIB libspdk_event_iobuf.a
00:02:43.705 SO libspdk_event_sock.so.5.0
00:02:43.705 SO libspdk_event_vhost_blk.so.3.0
00:02:43.705 SO libspdk_event_vmd.so.6.0
00:02:43.705 SO libspdk_event_scheduler.so.4.0
00:02:43.705 SO libspdk_event_keyring.so.1.0
00:02:43.705 SO libspdk_event_iobuf.so.3.0
00:02:43.705 SYMLINK libspdk_event_sock.so
00:02:43.705 SYMLINK libspdk_event_vhost_blk.so
00:02:43.705 SYMLINK libspdk_event_scheduler.so
00:02:43.705 SYMLINK libspdk_event_vmd.so
00:02:43.705 SYMLINK libspdk_event_keyring.so
00:02:43.705 SYMLINK libspdk_event_iobuf.so
00:02:43.964 CC module/event/subsystems/accel/accel.o
00:02:44.224 LIB libspdk_event_accel.a
00:02:44.224 SO libspdk_event_accel.so.6.0
00:02:44.224 SYMLINK libspdk_event_accel.so
00:02:44.483 CC module/event/subsystems/bdev/bdev.o
00:02:44.743 LIB libspdk_event_bdev.a
00:02:44.743 SO libspdk_event_bdev.so.6.0
00:02:44.743 SYMLINK libspdk_event_bdev.so
00:02:45.002 CC module/event/subsystems/nvmf/nvmf_rpc.o
00:02:45.262 CC module/event/subsystems/nvmf/nvmf_tgt.o
00:02:45.262 CC module/event/subsystems/scsi/scsi.o
00:02:45.262 CC module/event/subsystems/ublk/ublk.o
00:02:45.262 CC module/event/subsystems/nbd/nbd.o
00:02:45.262 LIB libspdk_event_nbd.a
00:02:45.262 LIB libspdk_event_ublk.a
00:02:45.262 LIB libspdk_event_scsi.a
00:02:45.262 SO libspdk_event_nbd.so.6.0
00:02:45.262 SO libspdk_event_scsi.so.6.0
00:02:45.262 SO libspdk_event_ublk.so.3.0
00:02:45.262 LIB libspdk_event_nvmf.a
00:02:45.262 SYMLINK libspdk_event_ublk.so
00:02:45.262 SYMLINK libspdk_event_nbd.so
00:02:45.262 SO libspdk_event_nvmf.so.6.0
00:02:45.262 SYMLINK libspdk_event_scsi.so
00:02:45.521 SYMLINK libspdk_event_nvmf.so
00:02:45.780 CC module/event/subsystems/vhost_scsi/vhost_scsi.o
00:02:45.780 CC module/event/subsystems/iscsi/iscsi.o
00:02:45.780 LIB libspdk_event_vhost_scsi.a
00:02:45.780 LIB libspdk_event_iscsi.a
00:02:45.780 SO libspdk_event_vhost_scsi.so.3.0
00:02:45.780 SO libspdk_event_iscsi.so.6.0
00:02:45.780 SYMLINK libspdk_event_vhost_scsi.so
00:02:46.039 SYMLINK libspdk_event_iscsi.so
00:02:46.039 SO libspdk.so.6.0
00:02:46.039 SYMLINK libspdk.so
00:02:46.298 CC app/trace_record/trace_record.o
00:02:46.298 CC test/rpc_client/rpc_client_test.o
00:02:46.298 CXX app/trace/trace.o
00:02:46.298 TEST_HEADER include/spdk/accel.h
00:02:46.298 CC app/spdk_nvme_identify/identify.o
00:02:46.298 TEST_HEADER include/spdk/accel_module.h
00:02:46.298 TEST_HEADER include/spdk/assert.h
00:02:46.298 TEST_HEADER include/spdk/barrier.h
00:02:46.298 TEST_HEADER include/spdk/base64.h
00:02:46.298 CC app/spdk_lspci/spdk_lspci.o
00:02:46.298 TEST_HEADER include/spdk/bdev.h
00:02:46.298 CC app/spdk_nvme_perf/perf.o
00:02:46.298 CC app/spdk_top/spdk_top.o
00:02:46.298 TEST_HEADER include/spdk/bdev_module.h
00:02:46.298 TEST_HEADER include/spdk/bdev_zone.h
00:02:46.298 TEST_HEADER include/spdk/bit_pool.h
00:02:46.298 TEST_HEADER include/spdk/bit_array.h
00:02:46.298 TEST_HEADER include/spdk/blob_bdev.h
00:02:46.298 TEST_HEADER include/spdk/blobfs_bdev.h
00:02:46.298 TEST_HEADER include/spdk/conf.h
00:02:46.298 TEST_HEADER include/spdk/blobfs.h
00:02:46.298 TEST_HEADER include/spdk/blob.h
00:02:46.562 TEST_HEADER include/spdk/config.h
00:02:46.562 TEST_HEADER include/spdk/crc16.h
00:02:46.562 TEST_HEADER include/spdk/cpuset.h
00:02:46.562 CC app/spdk_nvme_discover/discovery_aer.o
00:02:46.562 TEST_HEADER include/spdk/crc64.h
00:02:46.562 TEST_HEADER include/spdk/dif.h
00:02:46.562 TEST_HEADER include/spdk/crc32.h
00:02:46.562 TEST_HEADER include/spdk/dma.h
00:02:46.562 TEST_HEADER include/spdk/env.h
00:02:46.562 TEST_HEADER include/spdk/endian.h
00:02:46.562 TEST_HEADER include/spdk/env_dpdk.h
00:02:46.562 TEST_HEADER include/spdk/fd.h
00:02:46.562 TEST_HEADER include/spdk/fd_group.h
00:02:46.562 TEST_HEADER include/spdk/file.h
00:02:46.562 TEST_HEADER include/spdk/event.h
00:02:46.562 TEST_HEADER include/spdk/hexlify.h
00:02:46.562 TEST_HEADER include/spdk/ftl.h
00:02:46.562 TEST_HEADER include/spdk/histogram_data.h
00:02:46.562 TEST_HEADER include/spdk/gpt_spec.h
00:02:46.562 TEST_HEADER include/spdk/idxd.h
00:02:46.562 TEST_HEADER include/spdk/init.h
00:02:46.562 TEST_HEADER include/spdk/idxd_spec.h
00:02:46.562 TEST_HEADER include/spdk/ioat.h
00:02:46.562 TEST_HEADER include/spdk/iscsi_spec.h
00:02:46.562 TEST_HEADER include/spdk/ioat_spec.h
00:02:46.562 TEST_HEADER include/spdk/json.h
00:02:46.562 TEST_HEADER include/spdk/jsonrpc.h
00:02:46.562 TEST_HEADER include/spdk/keyring.h
00:02:46.562 TEST_HEADER include/spdk/keyring_module.h
00:02:46.562 TEST_HEADER include/spdk/likely.h
00:02:46.562 TEST_HEADER include/spdk/log.h
00:02:46.562 TEST_HEADER include/spdk/lvol.h
00:02:46.562 TEST_HEADER include/spdk/mmio.h
00:02:46.562 CC examples/interrupt_tgt/interrupt_tgt.o
00:02:46.562 TEST_HEADER include/spdk/memory.h
00:02:46.562 TEST_HEADER include/spdk/notify.h
00:02:46.562 TEST_HEADER include/spdk/nbd.h
00:02:46.562 TEST_HEADER include/spdk/nvme.h
00:02:46.562 TEST_HEADER include/spdk/nvme_intel.h
00:02:46.562 TEST_HEADER include/spdk/nvme_ocssd.h
00:02:46.562 TEST_HEADER include/spdk/nvme_ocssd_spec.h
00:02:46.562 TEST_HEADER include/spdk/nvmf_cmd.h
00:02:46.562 TEST_HEADER include/spdk/nvme_spec.h
00:02:46.562 TEST_HEADER include/spdk/nvme_zns.h
00:02:46.562 TEST_HEADER include/spdk/nvmf_fc_spec.h
00:02:46.562 CC app/nvmf_tgt/nvmf_main.o
00:02:46.562 TEST_HEADER include/spdk/nvmf.h
00:02:46.562 TEST_HEADER include/spdk/opal.h
00:02:46.562 TEST_HEADER include/spdk/nvmf_spec.h
00:02:46.562 TEST_HEADER include/spdk/nvmf_transport.h
00:02:46.562 TEST_HEADER include/spdk/opal_spec.h
00:02:46.562 CC app/spdk_dd/spdk_dd.o
00:02:46.562 TEST_HEADER include/spdk/pci_ids.h
00:02:46.562 TEST_HEADER include/spdk/queue.h
00:02:46.562 TEST_HEADER include/spdk/pipe.h
00:02:46.562 TEST_HEADER include/spdk/reduce.h
00:02:46.562 TEST_HEADER include/spdk/scheduler.h
00:02:46.562 TEST_HEADER include/spdk/scsi_spec.h
00:02:46.562 TEST_HEADER include/spdk/rpc.h
00:02:46.562 TEST_HEADER include/spdk/scsi.h
00:02:46.562 TEST_HEADER include/spdk/sock.h
00:02:46.562 TEST_HEADER include/spdk/stdinc.h
00:02:46.562 TEST_HEADER include/spdk/string.h
00:02:46.562 TEST_HEADER include/spdk/thread.h
00:02:46.562 TEST_HEADER include/spdk/trace_parser.h
00:02:46.562 TEST_HEADER include/spdk/trace.h
00:02:46.562 TEST_HEADER include/spdk/ublk.h
00:02:46.562 TEST_HEADER include/spdk/util.h
00:02:46.562 TEST_HEADER include/spdk/uuid.h
00:02:46.562 TEST_HEADER include/spdk/version.h
00:02:46.562 TEST_HEADER include/spdk/tree.h
00:02:46.562 TEST_HEADER include/spdk/vfio_user_pci.h
00:02:46.562 TEST_HEADER include/spdk/vfio_user_spec.h
00:02:46.562 TEST_HEADER include/spdk/vhost.h
00:02:46.562 TEST_HEADER include/spdk/vmd.h
00:02:46.562 TEST_HEADER include/spdk/xor.h
00:02:46.562 TEST_HEADER include/spdk/zipf.h
00:02:46.562 CXX test/cpp_headers/accel.o
00:02:46.562 CXX test/cpp_headers/accel_module.o
00:02:46.562 CXX test/cpp_headers/barrier.o
00:02:46.562 CXX test/cpp_headers/assert.o
00:02:46.562 CXX test/cpp_headers/base64.o
00:02:46.562 CXX test/cpp_headers/bdev.o
00:02:46.562 CC app/iscsi_tgt/iscsi_tgt.o
00:02:46.562 CXX test/cpp_headers/bdev_module.o
00:02:46.562 CXX test/cpp_headers/bdev_zone.o
00:02:46.562 CXX test/cpp_headers/bit_array.o
00:02:46.562 CXX test/cpp_headers/bit_pool.o
00:02:46.562 CXX test/cpp_headers/blob_bdev.o
00:02:46.562 CXX test/cpp_headers/blobfs.o
00:02:46.562 CXX test/cpp_headers/blobfs_bdev.o
00:02:46.562 CXX test/cpp_headers/blob.o
00:02:46.562 CXX test/cpp_headers/conf.o
00:02:46.562 CXX test/cpp_headers/config.o
00:02:46.562 CXX test/cpp_headers/crc32.o
00:02:46.562 CXX test/cpp_headers/crc16.o
00:02:46.562 CXX test/cpp_headers/cpuset.o
00:02:46.562 CXX test/cpp_headers/crc64.o
00:02:46.562 CXX test/cpp_headers/dif.o
00:02:46.562 CXX test/cpp_headers/dma.o
00:02:46.562 CXX test/cpp_headers/endian.o
00:02:46.562 CXX test/cpp_headers/env_dpdk.o
00:02:46.562 CXX test/cpp_headers/env.o
00:02:46.562 CXX test/cpp_headers/event.o
00:02:46.562 CXX test/cpp_headers/fd_group.o
00:02:46.562 CXX test/cpp_headers/file.o
00:02:46.562 CXX test/cpp_headers/fd.o
00:02:46.562 CXX test/cpp_headers/ftl.o
00:02:46.562 CXX test/cpp_headers/gpt_spec.o
00:02:46.562 CXX test/cpp_headers/histogram_data.o
00:02:46.562 CXX test/cpp_headers/hexlify.o
00:02:46.562 CXX test/cpp_headers/idxd_spec.o
00:02:46.562 CXX test/cpp_headers/init.o
00:02:46.562 CXX test/cpp_headers/idxd.o
00:02:46.562 CXX test/cpp_headers/ioat_spec.o
00:02:46.562 CXX test/cpp_headers/ioat.o
00:02:46.562 CXX test/cpp_headers/json.o
00:02:46.562 CXX test/cpp_headers/iscsi_spec.o
00:02:46.562 CXX test/cpp_headers/keyring_module.o
00:02:46.562 CXX test/cpp_headers/jsonrpc.o
00:02:46.562 CXX test/cpp_headers/keyring.o
00:02:46.562 CC app/spdk_tgt/spdk_tgt.o
00:02:46.562 CXX test/cpp_headers/log.o
00:02:46.562 CXX test/cpp_headers/likely.o
00:02:46.562 CXX test/cpp_headers/lvol.o
00:02:46.562 CXX test/cpp_headers/mmio.o
00:02:46.562 CXX test/cpp_headers/nbd.o
00:02:46.562 CXX test/cpp_headers/memory.o
00:02:46.562 CXX test/cpp_headers/notify.o
00:02:46.562 CXX test/cpp_headers/nvme.o
00:02:46.562 CXX test/cpp_headers/nvme_ocssd.o
00:02:46.562 CXX test/cpp_headers/nvme_intel.o
00:02:46.562 CXX test/cpp_headers/nvme_ocssd_spec.o
00:02:46.562 CXX test/cpp_headers/nvme_spec.o
00:02:46.562 CXX test/cpp_headers/nvmf_cmd.o
00:02:46.562 CXX test/cpp_headers/nvmf_fc_spec.o
00:02:46.562 CXX test/cpp_headers/nvme_zns.o
00:02:46.562 CXX test/cpp_headers/nvmf.o
00:02:46.562 CXX test/cpp_headers/nvmf_transport.o
00:02:46.562 CXX test/cpp_headers/opal.o
00:02:46.562 CXX test/cpp_headers/nvmf_spec.o
00:02:46.562 CXX test/cpp_headers/opal_spec.o
00:02:46.562 CXX test/cpp_headers/pci_ids.o
00:02:46.562 CXX test/cpp_headers/pipe.o
00:02:46.562 CXX test/cpp_headers/queue.o
00:02:46.562 CXX test/cpp_headers/reduce.o
00:02:46.562 CC test/app/histogram_perf/histogram_perf.o
00:02:46.562 CC test/env/vtophys/vtophys.o
00:02:46.562 CXX test/cpp_headers/rpc.o
00:02:46.562 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o
00:02:46.562 CC test/env/memory/memory_ut.o
00:02:46.562 CC examples/ioat/perf/perf.o
00:02:46.562 CC app/fio/nvme/fio_plugin.o
00:02:46.562 CC test/thread/poller_perf/poller_perf.o
00:02:46.562 CC test/app/jsoncat/jsoncat.o
00:02:46.562 CC examples/ioat/verify/verify.o
00:02:46.562 CC examples/util/zipf/zipf.o
00:02:46.562 CXX test/cpp_headers/scheduler.o
00:02:46.562 CC test/app/stub/stub.o
00:02:46.562 CC test/env/pci/pci_ut.o
00:02:46.835 CC test/dma/test_dma/test_dma.o
00:02:46.835 CC test/app/bdev_svc/bdev_svc.o
00:02:46.835 CC app/fio/bdev/fio_plugin.o
00:02:46.835 LINK rpc_client_test
00:02:46.835 LINK spdk_lspci
00:02:47.096 LINK nvmf_tgt
00:02:47.096 LINK spdk_nvme_discover
00:02:47.096 CC test/env/mem_callbacks/mem_callbacks.o
00:02:47.096 LINK interrupt_tgt
00:02:47.096 LINK spdk_trace_record
00:02:47.096 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o
00:02:47.096 LINK jsoncat
00:02:47.096 LINK env_dpdk_post_init
00:02:47.097 CXX test/cpp_headers/scsi.o
00:02:47.097 CXX test/cpp_headers/scsi_spec.o
00:02:47.097 CXX test/cpp_headers/sock.o
00:02:47.097 CXX test/cpp_headers/stdinc.o
00:02:47.097 CXX test/cpp_headers/string.o
00:02:47.097 LINK zipf
00:02:47.097 CXX test/cpp_headers/thread.o
00:02:47.097 CXX test/cpp_headers/trace.o
00:02:47.097 CXX test/cpp_headers/trace_parser.o
00:02:47.097 CXX test/cpp_headers/tree.o
00:02:47.097 CXX test/cpp_headers/ublk.o
00:02:47.097 CXX test/cpp_headers/util.o
00:02:47.097 CXX test/cpp_headers/uuid.o
00:02:47.097 CXX test/cpp_headers/version.o
00:02:47.097 LINK vtophys
00:02:47.097 CXX test/cpp_headers/vfio_user_pci.o
00:02:47.097 CXX test/cpp_headers/vfio_user_spec.o
00:02:47.097 CXX test/cpp_headers/vhost.o
00:02:47.097 LINK histogram_perf
00:02:47.097 CXX test/cpp_headers/vmd.o
00:02:47.097 CXX test/cpp_headers/xor.o
00:02:47.097 CXX test/cpp_headers/zipf.o
00:02:47.097 LINK poller_perf
00:02:47.097 LINK iscsi_tgt
00:02:47.355 LINK ioat_perf
00:02:47.355 LINK bdev_svc
00:02:47.355 LINK spdk_tgt
00:02:47.355 LINK spdk_dd
00:02:47.355 LINK stub
00:02:47.355 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o
00:02:47.355 LINK verify
00:02:47.355 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o
00:02:47.355 LINK spdk_trace
00:02:47.355 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o
00:02:47.613 LINK pci_ut
00:02:47.613 LINK test_dma
00:02:47.613 LINK spdk_nvme
00:02:47.613 CC examples/idxd/perf/perf.o
00:02:47.613 CC examples/vmd/lsvmd/lsvmd.o
00:02:47.613 CC examples/vmd/led/led.o
00:02:47.613 LINK nvme_fuzz
00:02:47.613 CC examples/sock/hello_world/hello_sock.o
00:02:47.613 CC examples/thread/thread/thread_ex.o
00:02:47.613 LINK spdk_top
00:02:47.613 LINK spdk_bdev
00:02:47.613 LINK spdk_nvme_perf
00:02:47.613 LINK vhost_fuzz
00:02:47.613 LINK mem_callbacks
00:02:47.872 LINK spdk_nvme_identify
00:02:47.872 CC app/vhost/vhost.o
00:02:47.872 CC test/event/event_perf/event_perf.o
00:02:47.872 CC test/event/reactor/reactor.o
00:02:47.872 CC test/event/reactor_perf/reactor_perf.o
00:02:47.872 LINK lsvmd
00:02:47.872 CC test/event/app_repeat/app_repeat.o
00:02:47.872 LINK led
00:02:47.872 CC test/event/scheduler/scheduler.o
00:02:47.872 LINK hello_sock
00:02:47.872 LINK reactor
00:02:47.872 LINK event_perf
00:02:47.872 LINK idxd_perf
00:02:47.872 LINK reactor_perf
00:02:47.872 LINK thread
00:02:47.872 LINK vhost
00:02:47.872 LINK app_repeat
00:02:48.132 LINK scheduler
00:02:48.132 CC test/nvme/compliance/nvme_compliance.o
00:02:48.132 CC test/nvme/sgl/sgl.o
00:02:48.132 CC test/nvme/e2edp/nvme_dp.o
00:02:48.132 CC test/nvme/reserve/reserve.o
00:02:48.132 CC test/nvme/boot_partition/boot_partition.o
00:02:48.132 CC test/nvme/err_injection/err_injection.o
00:02:48.132 CC test/nvme/simple_copy/simple_copy.o
00:02:48.132 CC test/nvme/overhead/overhead.o
00:02:48.132 CC test/nvme/fdp/fdp.o
00:02:48.132 CC test/nvme/fused_ordering/fused_ordering.o
00:02:48.132 CC test/nvme/cuse/cuse.o
00:02:48.132 CC test/nvme/doorbell_aers/doorbell_aers.o
00:02:48.132 CC test/nvme/aer/aer.o
00:02:48.132 CC test/nvme/reset/reset.o
00:02:48.132 CC test/nvme/connect_stress/connect_stress.o
00:02:48.132 CC test/blobfs/mkfs/mkfs.o
00:02:48.132 CC test/nvme/startup/startup.o
00:02:48.132 CC test/accel/dif/dif.o
00:02:48.132 LINK memory_ut
00:02:48.132 CC test/lvol/esnap/esnap.o
00:02:48.391 LINK boot_partition
00:02:48.391 LINK reserve
00:02:48.391 LINK startup
00:02:48.391 LINK err_injection
00:02:48.391 LINK connect_stress
00:02:48.391 LINK fused_ordering
00:02:48.391 LINK simple_copy
00:02:48.391 LINK doorbell_aers
00:02:48.391 CC examples/nvme/cmb_copy/cmb_copy.o
00:02:48.391 LINK mkfs
00:02:48.391 CC examples/nvme/abort/abort.o
00:02:48.391 LINK nvme_dp
00:02:48.391 CC examples/nvme/pmr_persistence/pmr_persistence.o
00:02:48.391 CC examples/nvme/reconnect/reconnect.o
00:02:48.391 CC examples/nvme/hotplug/hotplug.o
00:02:48.391 CC examples/nvme/arbitration/arbitration.o
00:02:48.391 LINK sgl
00:02:48.391 CC examples/nvme/hello_world/hello_world.o
00:02:48.391 CC examples/nvme/nvme_manage/nvme_manage.o
00:02:48.391 LINK reset
00:02:48.391 LINK overhead
00:02:48.391 LINK nvme_compliance
00:02:48.391 LINK aer
00:02:48.391 LINK fdp
00:02:48.391 CC examples/accel/perf/accel_perf.o
00:02:48.391 CC examples/blob/cli/blobcli.o
00:02:48.391 CC examples/blob/hello_world/hello_blob.o
00:02:48.650 LINK cmb_copy
00:02:48.650 LINK pmr_persistence
00:02:48.650 LINK dif
00:02:48.650 LINK hello_world
00:02:48.650 LINK hotplug
00:02:48.650 LINK arbitration
00:02:48.650 LINK reconnect
00:02:48.650 LINK abort
00:02:48.650 LINK iscsi_fuzz
00:02:48.650 LINK hello_blob
00:02:48.650 LINK nvme_manage
00:02:48.908 LINK accel_perf
00:02:48.908 LINK blobcli
00:02:49.167 CC test/bdev/bdevio/bdevio.o
00:02:49.167 LINK cuse
00:02:49.426 CC examples/bdev/hello_world/hello_bdev.o
00:02:49.426 CC examples/bdev/bdevperf/bdevperf.o
00:02:49.426 LINK bdevio
00:02:49.685 LINK hello_bdev
00:02:49.944 LINK bdevperf
00:02:50.511 CC examples/nvmf/nvmf/nvmf.o
00:02:50.769 LINK nvmf
00:02:51.703 LINK esnap
00:02:51.961
00:02:51.961 real 1m9.494s
00:02:51.961 user 14m15.931s
00:02:51.962 sys 4m14.358s
00:02:51.962 11:43:42 make -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:02:51.962 11:43:42 make -- common/autotest_common.sh@10 -- $ set +x
00:02:51.962 ************************************
00:02:51.962 END TEST make
00:02:51.962 ************************************
00:02:51.962 11:43:42 -- common/autotest_common.sh@1142 -- $ return 0
00:02:51.962 11:43:42 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources
00:02:51.962 11:43:42 -- pm/common@29 -- $ signal_monitor_resources TERM
00:02:51.962 11:43:42 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:02:51.962 11:43:42 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:51.962 11:43:42 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:02:51.962 11:43:42 -- pm/common@44 -- $ pid=425409
00:02:51.962 11:43:42 -- pm/common@50 -- $ kill -TERM 425409
00:02:51.962 11:43:42 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:51.962 11:43:42 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:02:51.962 11:43:42 -- pm/common@44 -- $ pid=425411
00:02:51.962 11:43:42 -- pm/common@50 -- $ kill -TERM 425411
00:02:51.962 11:43:42 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:51.962 11:43:42 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:02:51.962 11:43:42 -- pm/common@44 -- $ pid=425413
00:02:51.962 11:43:42 -- pm/common@50 -- $ kill -TERM 425413
00:02:51.962 11:43:42 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:51.962 11:43:42 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:02:51.962 11:43:42 -- pm/common@44 -- $ pid=425436
00:02:51.962 11:43:42 -- pm/common@50 -- $ sudo -E kill -TERM 425436
00:02:51.962 11:43:42 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh
00:02:51.962 11:43:42 -- nvmf/common.sh@7 -- # uname -s
00:02:51.962 11:43:42 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:02:51.962 11:43:42 -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:02:51.962 11:43:42 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:02:51.962 11:43:42 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:02:51.962 11:43:42 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:02:51.962 11:43:42 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:02:51.962 11:43:42 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:02:51.962 11:43:42 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:02:51.962 11:43:42 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:02:51.962 11:43:42 -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:02:51.962 11:43:42 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562
00:02:51.962 11:43:42 -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562
00:02:51.962 11:43:42 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:02:51.962 11:43:42 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:02:51.962 11:43:42 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:02:51.962 11:43:42 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:02:51.962 11:43:42 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:02:52.220 11:43:42 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]]
00:02:52.220 11:43:42 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:02:52.220 11:43:42 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:02:52.220 11:43:42 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:52.220 11:43:42 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:52.221 11:43:42 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:52.221 11:43:42 -- paths/export.sh@5 -- # export PATH
00:02:52.221 11:43:42 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:52.221 11:43:42 -- nvmf/common.sh@47 -- # : 0
00:02:52.221 11:43:42 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID
00:02:52.221 11:43:42 -- nvmf/common.sh@49 -- # build_nvmf_app_args
00:02:52.221 11:43:42 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:02:52.221 11:43:42 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:02:52.221 11:43:42 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:02:52.221 11:43:42 -- nvmf/common.sh@33 -- # '[' -n '' ']'
00:02:52.221 11:43:42 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']'
00:02:52.221 11:43:42 -- nvmf/common.sh@51 -- # have_pci_nics=0
00:02:52.221 11:43:42 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']'
00:02:52.221 11:43:42 -- spdk/autotest.sh@32 -- # uname -s
00:02:52.221 11:43:42 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']'
00:02:52.221 11:43:42 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h'
00:02:52.221 11:43:42 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps
00:02:52.221 11:43:42 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t'
00:02:52.221 11:43:42 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps
00:02:52.221 11:43:42 -- spdk/autotest.sh@44 -- # modprobe nbd
00:02:52.221 11:43:42 -- spdk/autotest.sh@46 -- # type -P udevadm
00:02:52.221 11:43:42 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm
00:02:52.221 11:43:42 -- spdk/autotest.sh@48 -- # udevadm_pid=491973
00:02:52.221 11:43:42 -- spdk/autotest.sh@53 -- # start_monitor_resources
00:02:52.221 11:43:42 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property
00:02:52.221 11:43:42 -- pm/common@17 -- # local monitor
00:02:52.221 11:43:42 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}"
00:02:52.221 11:43:42 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}"
00:02:52.221 11:43:42 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}"
00:02:52.221 11:43:42 -- pm/common@21 -- # date +%s
00:02:52.221 11:43:42 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}"
00:02:52.221 11:43:42 -- pm/common@21 -- # date +%s
00:02:52.221 11:43:42 -- pm/common@25 -- # sleep 1
00:02:52.221 11:43:42 -- pm/common@21 -- # date +%s
00:02:52.221 11:43:42 -- pm/common@21 -- # date +%s
00:02:52.221 11:43:42 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720777422
00:02:52.221 11:43:42 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720777422
00:02:52.221 11:43:42 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720777422 00:02:52.221 11:43:42 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720777422 00:02:52.221 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720777422_collect-vmstat.pm.log 00:02:52.221 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720777422_collect-cpu-load.pm.log 00:02:52.221 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720777422_collect-cpu-temp.pm.log 00:02:52.221 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720777422_collect-bmc-pm.bmc.pm.log 00:02:53.214 11:43:43 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:53.214 11:43:43 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:53.214 11:43:43 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:53.214 11:43:43 -- common/autotest_common.sh@10 -- # set +x 00:02:53.214 11:43:43 -- spdk/autotest.sh@59 -- # create_test_list 00:02:53.214 11:43:43 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:53.214 11:43:43 -- common/autotest_common.sh@10 -- # set +x 00:02:53.214 11:43:43 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:53.214 11:43:43 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:53.214 11:43:43 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:53.214 11:43:43 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:53.214 11:43:43 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:02:53.214 11:43:43 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:53.214 11:43:43 -- common/autotest_common.sh@1455 -- # uname 00:02:53.214 11:43:43 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:02:53.214 11:43:43 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:53.214 11:43:43 -- common/autotest_common.sh@1475 -- # uname 00:02:53.214 11:43:43 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:02:53.214 11:43:43 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:53.214 11:43:43 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:53.214 11:43:43 -- spdk/autotest.sh@72 -- # hash lcov 00:02:53.214 11:43:43 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:53.214 11:43:43 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:53.214 --rc lcov_branch_coverage=1 00:02:53.214 --rc lcov_function_coverage=1 00:02:53.214 --rc genhtml_branch_coverage=1 00:02:53.214 --rc genhtml_function_coverage=1 00:02:53.214 --rc genhtml_legend=1 00:02:53.214 --rc geninfo_all_blocks=1 00:02:53.214 ' 00:02:53.214 11:43:43 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:53.214 --rc lcov_branch_coverage=1 00:02:53.214 --rc lcov_function_coverage=1 00:02:53.214 --rc genhtml_branch_coverage=1 00:02:53.214 --rc genhtml_function_coverage=1 00:02:53.214 --rc genhtml_legend=1 00:02:53.214 --rc geninfo_all_blocks=1 00:02:53.214 ' 00:02:53.214 11:43:43 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:53.214 --rc lcov_branch_coverage=1 00:02:53.214 --rc lcov_function_coverage=1 00:02:53.214 --rc genhtml_branch_coverage=1 00:02:53.214 --rc genhtml_function_coverage=1 00:02:53.214 --rc genhtml_legend=1 00:02:53.214 --rc geninfo_all_blocks=1 00:02:53.214 --no-external' 00:02:53.214 11:43:43 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:53.214 --rc lcov_branch_coverage=1 00:02:53.214 --rc lcov_function_coverage=1 00:02:53.214 --rc genhtml_branch_coverage=1 00:02:53.214 --rc genhtml_function_coverage=1 00:02:53.214 --rc 
genhtml_legend=1 00:02:53.214 --rc geninfo_all_blocks=1 00:02:53.214 --no-external' 00:02:53.214 11:43:43 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:53.214 lcov: LCOV version 1.14 00:02:53.214 11:43:43 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV 
did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV 
did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:02:57.537 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 
00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:02:57.537 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:02:57.537 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no 
functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:02:57.538 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV 
did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 
00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:02:57.538 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:02:57.538 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:12.418 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:12.418 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:17.687 11:44:07 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:17.687 11:44:07 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:17.687 11:44:07 -- common/autotest_common.sh@10 -- # set +x 00:03:17.687 11:44:07 -- spdk/autotest.sh@91 -- # rm -f 00:03:17.687 11:44:07 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:20.973 0000:5f:00.0 (1b96 2600): Already using the nvme driver 00:03:20.973 0000:5e:00.0 (8086 0a54): Already using the nvme driver 00:03:20.973 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:20.973 
0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:20.973 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:20.973 11:44:11 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:20.973 11:44:11 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:20.973 11:44:11 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:20.973 11:44:11 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:20.973 11:44:11 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:20.973 11:44:11 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:20.973 11:44:11 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:20.973 11:44:11 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:20.973 11:44:11 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:20.973 11:44:11 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:20.973 11:44:11 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n1 00:03:20.973 11:44:11 
-- common/autotest_common.sh@1662 -- # local device=nvme1n1 00:03:20.973 11:44:11 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:03:20.973 11:44:11 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:20.973 11:44:11 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:20.973 11:44:11 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n2 00:03:20.973 11:44:11 -- common/autotest_common.sh@1662 -- # local device=nvme1n2 00:03:20.973 11:44:11 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n2/queue/zoned ]] 00:03:20.973 11:44:11 -- common/autotest_common.sh@1665 -- # [[ host-managed != none ]] 00:03:20.973 11:44:11 -- common/autotest_common.sh@1674 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:03:20.973 11:44:11 -- spdk/autotest.sh@98 -- # (( 1 > 0 )) 00:03:20.973 11:44:11 -- spdk/autotest.sh@103 -- # export PCI_BLOCKED=0000:5f:00.0 00:03:20.973 11:44:11 -- spdk/autotest.sh@103 -- # PCI_BLOCKED=0000:5f:00.0 00:03:20.973 11:44:11 -- spdk/autotest.sh@104 -- # export PCI_ZONED=0000:5f:00.0 00:03:20.973 11:44:11 -- spdk/autotest.sh@104 -- # PCI_ZONED=0000:5f:00.0 00:03:20.973 11:44:11 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:20.973 11:44:11 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:20.973 11:44:11 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:20.973 11:44:11 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:20.973 11:44:11 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:20.973 No valid GPT data, bailing 00:03:20.973 11:44:11 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:20.973 11:44:11 -- scripts/common.sh@391 -- # pt= 00:03:20.973 11:44:11 -- scripts/common.sh@392 -- # return 1 00:03:20.973 11:44:11 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:20.973 1+0 records in 00:03:20.973 1+0 records out 
00:03:20.973 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00197286 s, 532 MB/s 00:03:20.973 11:44:11 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:20.973 11:44:11 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:20.973 11:44:11 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme1n1 00:03:20.973 11:44:11 -- scripts/common.sh@378 -- # local block=/dev/nvme1n1 pt 00:03:20.973 11:44:11 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme1n1 00:03:21.233 No valid GPT data, bailing 00:03:21.233 11:44:11 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme1n1 00:03:21.233 11:44:11 -- scripts/common.sh@391 -- # pt= 00:03:21.233 11:44:11 -- scripts/common.sh@392 -- # return 1 00:03:21.233 11:44:11 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme1n1 bs=1M count=1 00:03:21.233 1+0 records in 00:03:21.233 1+0 records out 00:03:21.233 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00532356 s, 197 MB/s 00:03:21.233 11:44:11 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:21.233 11:44:11 -- spdk/autotest.sh@112 -- # [[ -z 0000:5f:00.0 ]] 00:03:21.233 11:44:11 -- spdk/autotest.sh@112 -- # continue 00:03:21.233 11:44:11 -- spdk/autotest.sh@118 -- # sync 00:03:21.233 11:44:11 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:21.233 11:44:11 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:21.233 11:44:11 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:26.504 11:44:16 -- spdk/autotest.sh@124 -- # uname -s 00:03:26.504 11:44:16 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:26.504 11:44:16 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:26.504 11:44:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:26.504 11:44:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:26.504 11:44:16 -- 
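The trace above shows autotest's per-namespace cleanup: for each `/dev/nvme*n*` device it runs `spdk-gpt.py` and `blkid -s PTTYPE -o value`, and only when no partition table is found ("No valid GPT data, bailing") does it zero the first MiB with `dd`. A minimal sketch of that decision — the `maybe_wipe` helper and its echoed command line are illustrative stand-ins, not SPDK functions:

```shell
#!/usr/bin/env bash
# Sketch of the wipe decision traced at spdk/autotest.sh@110-114.
# maybe_wipe is a hypothetical helper; the real script tests blkid's
# PTTYPE output inline and runs dd directly.
maybe_wipe() {
    local dev=$1 pt=$2   # pt: value from `blkid -s PTTYPE -o value $dev`
    if [[ -z $pt ]]; then
        # No partition table found: the device is treated as free, so the
        # first MiB is cleared before the setup tests run.
        echo "dd if=/dev/zero of=$dev bs=1M count=1"
    else
        echo "skip $dev (partition table: $pt)"
    fi
}

maybe_wipe /dev/nvme0n1 ""     # -> dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1
maybe_wipe /dev/nvme1n1 gpt    # -> skip /dev/nvme1n1 (partition table: gpt)
```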
common/autotest_common.sh@10 -- # set +x 00:03:26.504 ************************************ 00:03:26.504 START TEST setup.sh 00:03:26.504 ************************************ 00:03:26.504 11:44:16 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:26.504 * Looking for test storage... 00:03:26.504 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:26.504 11:44:16 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:26.504 11:44:16 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:26.504 11:44:16 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:26.504 11:44:16 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:26.504 11:44:16 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:26.504 11:44:16 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:26.504 ************************************ 00:03:26.504 START TEST acl 00:03:26.504 ************************************ 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:26.504 * Looking for test storage... 
00:03:26.504 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:26.504 11:44:16 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n1 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme1n1 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n2 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme1n2 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n2/queue/zoned ]] 00:03:26.504 11:44:16 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ host-managed != none ]] 00:03:26.504 11:44:16 setup.sh.acl -- 
common/autotest_common.sh@1674 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:03:26.504 11:44:16 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:26.504 11:44:16 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:26.504 11:44:16 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:26.504 11:44:16 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:26.504 11:44:16 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:26.504 11:44:16 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:26.504 11:44:16 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:30.698 11:44:20 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:30.698 11:44:20 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:30.698 11:44:20 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:30.698 11:44:20 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:30.698 11:44:20 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:30.698 11:44:20 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:33.233 Hugepages 00:03:33.233 node hugesize free / total 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 
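Both `autotest.sh` and `acl.sh` call `get_zoned_devs`, which reads `/sys/block/nvme*/queue/zoned` and treats any value other than `none` (here `host-managed` on nvme1n2) as zoned, recording the namespace's BDF so it can be exported as `PCI_BLOCKED`/`PCI_ZONED`. A sketch of that classification, with the sysfs attribute value passed in as a parameter so the logic can run without sysfs (`is_zoned_attr` is an illustrative name):

```shell
#!/usr/bin/env bash
# Sketch of the is_block_zoned check traced at autotest_common.sh@1662-1674.
# The real helper reads /sys/block/<dev>/queue/zoned; here the attribute
# value is an argument for illustration.
is_zoned_attr() {
    # "none" means a conventional namespace; any other value,
    # e.g. "host-managed" or "host-aware", is zoned.
    [[ -n $1 && $1 != none ]]
}

declare -A zoned_devs=()
# Values mirror the log: nvme0n1/nvme1n1 conventional, nvme1n2 host-managed.
is_zoned_attr none         || echo "nvme0n1: conventional"
is_zoned_attr none         || echo "nvme1n1: conventional"
is_zoned_attr host-managed && zoned_devs[nvme1n2]=0000:5f:00.0

echo "PCI_BLOCKED=${zoned_devs[nvme1n2]}"   # -> PCI_BLOCKED=0000:5f:00.0
```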
00:03:33.233 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 
0000:00:04.5 == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.233 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@21 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5f:00.0 == *:*:*.* ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@21 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@21 -- # continue 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:33.490 11:44:23 setup.sh.acl -- 
setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.490 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 
00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:33.491 11:44:23 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:33.491 11:44:23 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:33.491 11:44:23 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:33.491 11:44:23 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:33.491 ************************************ 00:03:33.491 START TEST denied 00:03:33.491 ************************************ 00:03:33.491 11:44:23 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:03:33.491 11:44:23 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED='0000:5f:00.0 0000:5e:00.0' 00:03:33.491 11:44:23 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:33.491 11:44:23 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:33.491 11:44:23 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:33.491 11:44:23 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:37.682 0000:5e:00.0 (8086 0a54): Skipping denied controller at 0000:5e:00.0 00:03:37.682 11:44:27 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:03:37.682 11:44:27 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:37.682 11:44:27 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:37.682 
11:44:27 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:03:37.682 11:44:27 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:03:37.682 11:44:27 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:37.682 11:44:27 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:37.682 11:44:27 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:37.682 11:44:27 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:37.682 11:44:27 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:41.877 00:03:41.877 real 0m8.072s 00:03:41.877 user 0m2.637s 00:03:41.877 sys 0m4.687s 00:03:41.877 11:44:31 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:41.877 11:44:31 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:41.877 ************************************ 00:03:41.877 END TEST denied 00:03:41.877 ************************************ 00:03:41.877 11:44:31 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:41.877 11:44:31 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:41.877 11:44:31 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:41.877 11:44:31 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:41.877 11:44:31 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:41.877 ************************************ 00:03:41.877 START TEST allowed 00:03:41.877 ************************************ 00:03:41.877 11:44:31 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:03:41.877 11:44:31 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:03:41.877 11:44:31 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:41.877 11:44:31 setup.sh.acl.allowed -- 
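The `denied` test above verifies the block took effect by resolving each device's driver symlink (`readlink -f /sys/bus/pci/devices/<bdf>/driver`) and comparing its basename against `nvme`, while the earlier acl.sh device loop skips any BDF that appears in the blocked list. Both string checks can be sketched as follows — `driver_name` and `is_blocked` are illustrative names; the real scripts do the same with inline `[[ ]]` pattern matches:

```shell
#!/usr/bin/env bash
# Sketch of the two string checks traced in setup/acl.sh.
driver_name() {
    # acl.sh@32-33: basename of the resolved driver symlink,
    # e.g. /sys/bus/pci/drivers/nvme -> nvme
    echo "${1##*/}"
}
is_blocked() {
    # acl.sh@21: substring test of a BDF against the $PCI_BLOCKED list
    local blocked=$1 bdf=$2
    [[ $blocked == *"$bdf"* ]]
}

driver_name /sys/bus/pci/drivers/nvme          # -> nvme
is_blocked "0000:5f:00.0" 0000:5f:00.0 && echo blocked
is_blocked "0000:5f:00.0" 0000:5e:00.0 || echo allowed
```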
setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:03:41.877 11:44:31 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:41.877 11:44:31 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:46.071 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:03:46.071 11:44:36 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:46.071 11:44:36 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:46.071 11:44:36 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:46.071 11:44:36 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:46.071 11:44:36 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:50.263 00:03:50.263 real 0m8.023s 00:03:50.263 user 0m2.621s 00:03:50.263 sys 0m4.581s 00:03:50.263 11:44:39 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:50.263 11:44:39 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:50.263 ************************************ 00:03:50.263 END TEST allowed 00:03:50.263 ************************************ 00:03:50.263 11:44:39 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:50.263 00:03:50.263 real 0m23.429s 00:03:50.263 user 0m8.079s 00:03:50.263 sys 0m13.999s 00:03:50.263 11:44:39 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:50.263 11:44:39 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:50.263 ************************************ 00:03:50.263 END TEST acl 00:03:50.263 ************************************ 00:03:50.263 11:44:39 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:03:50.263 11:44:39 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:50.263 11:44:39 setup.sh -- common/autotest_common.sh@1099 -- # 
'[' 2 -le 1 ']' 00:03:50.263 11:44:39 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:50.263 11:44:39 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:50.263 ************************************ 00:03:50.263 START TEST hugepages 00:03:50.263 ************************************ 00:03:50.263 11:44:39 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:50.263 * Looking for test storage... 00:03:50.263 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.263 11:44:40 
setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 76681896 kB' 'MemAvailable: 80062716 kB' 'Buffers: 11472 kB' 'Cached: 9022064 kB' 'SwapCached: 0 kB' 'Active: 6044172 kB' 'Inactive: 3499164 kB' 'Active(anon): 5664744 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513444 kB' 'Mapped: 159768 kB' 'Shmem: 5154944 kB' 'KReclaimable: 196908 kB' 'Slab: 590344 kB' 'SReclaimable: 196908 kB' 'SUnreclaim: 393436 kB' 'KernelStack: 19472 kB' 'PageTables: 8224 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52952944 kB' 'Committed_AS: 7103852 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217788 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.263 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:50.264 11:44:40 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue
00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:50.264 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue
[identical setup/common.sh@31-32 read/compare/continue xtrace repeated for each remaining /proc/meminfo key, AnonPages through HugePages_Surp, none matching Hugepagesize]
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/common.sh@33 -- # return 0
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
[setup/hugepages.sh@40-41 xtrace repeated: for each node and each "/sys/devices/system/node/node$node/hugepages/hugepages-"* entry, echo 0]
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:03:50.265 11:44:40 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup
00:03:50.265 11:44:40 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:03:50.265 11:44:40 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:50.265 11:44:40 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:03:50.265 ************************************
00:03:50.265 START TEST default_setup
00:03:50.265 ************************************
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output
00:03:50.265 11:44:40 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]]
00:03:50.265 11:44:40
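The repetitive read/compare/continue xtrace above is setup/common.sh's get_meminfo walking /proc/meminfo one line at a time until the requested key matches, then echoing its value. A minimal sketch of that pattern; the function name and the optional file argument are illustrative, not SPDK's exact code:

```shell
# Sketch of the get_meminfo scan seen in the xtrace: split each line on
# ': ' into key/value, skip non-matching keys, print the matching value.
get_meminfo_value() {
	local get=$1 file=${2:-/proc/meminfo} var val _
	while IFS=': ' read -r var val _; do
		[[ $var == "$get" ]] || continue  # the repeated [[ key == ... ]] / continue lines
		echo "$val"
		return 0
	done < "$file"
	return 1
}

# Demo against a sample snapshot rather than the live /proc/meminfo:
sample=$(mktemp)
printf 'MemTotal: 93322988 kB\nHugepagesize: 2048 kB\n' > "$sample"
get_meminfo_value Hugepagesize "$sample"  # prints 2048
rm -f "$sample"
```

On this host the same lookup yields 2048, which is why the log shows `echo 2048` followed by `default_hugepages=2048`.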
setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:52.800 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:53.367 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:53.367 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:53.367 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:53.367 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:53.367 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:53.367 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:53.368 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:54.305 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:03:54.305 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:54.305 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always 
[madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78829084 kB' 'MemAvailable: 82209524 kB' 'Buffers: 11472 kB' 'Cached: 9022176 kB' 'SwapCached: 0 kB' 'Active: 6061476 kB' 'Inactive: 3499164 kB' 'Active(anon): 5682048 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529744 kB' 'Mapped: 159920 kB' 'Shmem: 5155056 kB' 'KReclaimable: 196152 kB' 'Slab: 588320 kB' 'SReclaimable: 196152 kB' 'SUnreclaim: 392168 kB' 'KernelStack: 19552 kB' 'PageTables: 8344 
kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122928 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217964 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB'
00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:54.306 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
[identical setup/common.sh@31-32 read/compare/continue xtrace repeated for each remaining /proc/meminfo key, MemFree through HardwareCorrupted, none matching AnonHugePages]
00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0
00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:03:54.307 11:44:44
setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78828704 kB' 'MemAvailable: 82209144 kB' 'Buffers: 11472 kB' 'Cached: 9022180 kB' 'SwapCached: 0 kB' 'Active: 6059192 kB' 'Inactive: 3499164 kB' 'Active(anon): 5679764 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527992 kB' 'Mapped: 159900 kB' 'Shmem: 5155060 kB' 'KReclaimable: 196152 kB' 'Slab: 588368 kB' 'SReclaimable: 196152 kB' 'SUnreclaim: 392216 kB' 'KernelStack: 19712 kB' 'PageTables: 8336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217900 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 
1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:03:54.307 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 [... repeated per-field scan of /proc/meminfo (MemTotal .. HugePages_Rsvd): identical IFS=': ' / read -r var val _ / continue iterations elided ...] 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:54.309 11:44:44
setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78827656 kB' 'MemAvailable: 82208096 kB' 'Buffers: 11472 kB' 'Cached: 9022200 kB' 'SwapCached: 0 kB' 'Active: 6059508 kB' 'Inactive: 3499164 kB' 'Active(anon): 5680080 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528296 kB' 'Mapped: 159900 kB' 'Shmem: 5155080 kB' 'KReclaimable: 196152 kB' 'Slab: 588372 kB' 'SReclaimable: 196152 kB' 'SUnreclaim: 392220 kB' 'KernelStack: 19664 kB' 'PageTables: 8568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122968 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217916 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 
kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:03:54.309 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 [... repeated per-field scan of /proc/meminfo (MemTotal .. KReclaimable): identical IFS=': ' / read -r var val _ / continue iterations elided ...]
00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 
11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.310 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.311 11:44:44 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.311 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.572 11:44:44 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:54.572 nr_hugepages=1024 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:54.572 resv_hugepages=0 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:54.572 surplus_hugepages=0 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:54.572 anon_hugepages=0 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78827464 kB' 'MemAvailable: 82207904 kB' 'Buffers: 11472 kB' 'Cached: 9022220 kB' 'SwapCached: 0 kB' 'Active: 6059176 kB' 'Inactive: 3499164 kB' 'Active(anon): 5679748 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 
kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527940 kB' 'Mapped: 159900 kB' 'Shmem: 5155100 kB' 'KReclaimable: 196152 kB' 'Slab: 588372 kB' 'SReclaimable: 196152 kB' 'SUnreclaim: 392220 kB' 'KernelStack: 19568 kB' 'PageTables: 8104 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122988 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217916 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.572 11:44:44 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.572 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.572 11:44:44
[identical [[ key == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue / IFS / read iterations for the remaining /proc/meminfo keys from Buffers onward trimmed; the trace is cut off mid-scan at CmaTotal]
setup/common.sh@31 -- # IFS=': ' 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.573 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.574 
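The long runs of `IFS=': '` / `read -r var val _` / `[[ key == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]` / `continue` above are bash xtrace from setup/common.sh's get_meminfo helper: it walks a meminfo-format file line by line until the requested key matches, then echoes its value (here `HugePages_Total` → 1024). A minimal stand-alone sketch of that pattern — the helper name `meminfo_value` and the sed prefix-strip are assumptions based on the trace, not the exact SPDK source:

```shell
#!/usr/bin/env bash
# Simplified stand-in for the get_meminfo pattern traced above: scan a
# meminfo-format file for one key and print its value.
meminfo_value() {
    local get=$1 mem_f=$2
    local var val _
    # Per-node sysfs files prefix each line with "Node N "; strip it so
    # the same "key: value" split works for /proc/meminfo and
    # /sys/devices/system/node/nodeN/meminfo alike.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}
```

For example, `meminfo_value HugePages_Total /proc/meminfo` prints the pool size that the trace goes on to compare against 1024.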
11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 21466884 kB' 'MemUsed: 11167744 kB' 'SwapCached: 0 kB' 
'Active: 4372344 kB' 'Inactive: 3400940 kB' 'Active(anon): 4232452 kB' 'Inactive(anon): 0 kB' 'Active(file): 139892 kB' 'Inactive(file): 3400940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7606428 kB' 'Mapped: 63776 kB' 'AnonPages: 170052 kB' 'Shmem: 4065596 kB' 'KernelStack: 10120 kB' 'PageTables: 3692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125032 kB' 'Slab: 365668 kB' 'SReclaimable: 125032 kB' 'SUnreclaim: 240636 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.574 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:54.575 
11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:54.575 node0=1024 expecting 1024 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:54.575 00:03:54.575 real 0m4.489s 00:03:54.575 user 0m1.476s 00:03:54.575 sys 0m2.252s 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:54.575 11:44:44 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:03:54.575 ************************************ 00:03:54.575 END TEST default_setup 00:03:54.575 ************************************ 00:03:54.575 11:44:44 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:03:54.575 11:44:44 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:54.575 11:44:44 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:54.575 11:44:44 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.575 11:44:44 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:54.575 ************************************ 00:03:54.575 START TEST per_node_1G_alloc 00:03:54.575 ************************************ 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 
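At hugepages.sh@110 the trace asserts `(( 1024 == nr_hugepages + surp + resv ))`, and at @116-@117 each node's test count absorbs its reserved and surplus pages (both 0 in this run). The arithmetic reduces to a one-line consistency check; the wrapper function below is illustrative, not taken from the script:

```shell
#!/usr/bin/env bash
# The pool is consistent when the kernel's HugePages_Total equals the
# requested count plus surplus and reserved pages (hugepages.sh@110).
check_hugepage_accounting() {
    local total=$1 nr_hugepages=$2 surp=$3 resv=$4
    (( total == nr_hugepages + surp + resv ))
}
```

In the run above total=1024, nr_hugepages=1024, surp=0, resv=0, so the check passes and the test prints `node0=1024 expecting 1024`.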
00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:54.575 
11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:54.575 11:44:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:57.863 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:57.863 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:57.863 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 
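The per_node_1G_alloc prologue (hugepages.sh@62-@71) splits the requested pool across the user-supplied node list: with `NRHUGE=512` and `HUGENODE=0,1`, each of the two nodes is assigned 512 pages in `nodes_test[]`. A sketch of that loop, with array and count names taken from the trace (the surrounding function and its signature are assumptions):

```shell
#!/usr/bin/env bash
# Assign the per-node hugepage count to every requested node, mirroring
# the nodes_test[] bookkeeping traced at hugepages.sh@70-@71.
declare -A nodes_test=()
set_nodes_test() {
    local _nr_hugepages=$1; shift
    local node
    for node in "$@"; do
        nodes_test[$node]=$_nr_hugepages
    done
}
```

Calling `set_nodes_test 512 0 1` reproduces the state the trace reaches before `NRHUGE=512 HUGENODE=0,1 setup output` is invoked.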
00:03:57.863 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:57.863 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:57.863 11:44:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:57.863 11:44:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:57.863 11:44:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:57.863 11:44:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:57.863 11:44:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:57.863 11:44:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:57.863 11:44:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:57.863 11:44:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:57.863 11:44:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.863 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78840684 kB' 'MemAvailable: 82221124 kB' 'Buffers: 11472 kB' 'Cached: 9022316 kB' 'SwapCached: 0 kB' 'Active: 6058440 kB' 'Inactive: 3499164 kB' 'Active(anon): 5679012 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526532 kB' 'Mapped: 158952 kB' 'Shmem: 5155196 kB' 'KReclaimable: 196152 kB' 'Slab: 587856 kB' 'SReclaimable: 196152 kB' 'SUnreclaim: 391704 kB' 'KernelStack: 19472 kB' 'PageTables: 7892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7110280 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217948 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:57.863 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue [... the setup/common.sh@31/@32 trace (IFS=': ' / read -r var val _ / key test / continue) repeats identically for every remaining /proc/meminfo key, MemFree through HardwareCorrupted ...] 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc
-- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78837792 kB' 'MemAvailable: 82218232 kB' 'Buffers: 11472 kB' 'Cached: 9022316 kB' 'SwapCached: 0 kB' 'Active: 6058984 kB' 'Inactive: 3499164 kB' 'Active(anon): 5679556 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528032 kB' 'Mapped: 159356 kB' 'Shmem: 5155196 kB' 'KReclaimable: 196152 kB' 'Slab: 587856 kB' 'SReclaimable: 196152 kB' 'SUnreclaim: 391704 kB' 'KernelStack: 19424 kB' 'PageTables: 7732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7112840 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217932 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 
576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.865 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue [... the same per-key setup/common.sh@31/@32 continue trace repeats for each /proc/meminfo key, MemFree through AnonHugePages ...] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78831744 kB' 'MemAvailable: 82212184 kB' 'Buffers: 11472 kB' 'Cached: 9022340 kB' 'SwapCached: 0 kB' 'Active: 6063276 kB' 'Inactive: 
3499164 kB' 'Active(anon): 5683848 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531840 kB' 'Mapped: 159696 kB' 'Shmem: 5155220 kB' 'KReclaimable: 196152 kB' 'Slab: 587856 kB' 'SReclaimable: 196152 kB' 'SUnreclaim: 391704 kB' 'KernelStack: 19472 kB' 'PageTables: 7908 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7116440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217936 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.867 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 
11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.868 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.869 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.870 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.870 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:58.134 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:58.135 nr_hugepages=1024 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:58.135 resv_hugepages=0 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:58.135 surplus_hugepages=0 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:58.135 anon_hugepages=0 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 
-- # get_meminfo HugePages_Total 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78833020 kB' 'MemAvailable: 82213460 kB' 'Buffers: 11472 kB' 'Cached: 9022360 kB' 'SwapCached: 0 kB' 'Active: 6057580 kB' 'Inactive: 3499164 kB' 'Active(anon): 5678152 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526104 kB' 'Mapped: 158852 kB' 'Shmem: 5155240 kB' 'KReclaimable: 196152 kB' 'Slab: 587856 kB' 'SReclaimable: 196152 kB' 'SUnreclaim: 391704 kB' 'KernelStack: 19456 kB' 'PageTables: 7832 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7110344 kB' 'VmallocTotal: 34359738367 kB' 
'VmallocUsed: 217932 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 
11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.135 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:58.136 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:58.136 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:58.136 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 22513824 kB' 'MemUsed: 10120804 kB' 'SwapCached: 0 kB' 'Active: 4371724 kB' 'Inactive: 3400940 kB' 'Active(anon): 4231832 kB' 'Inactive(anon): 0 kB' 'Active(file): 139892 kB' 'Inactive(file): 3400940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7606524 kB' 'Mapped: 63208 kB' 'AnonPages: 169336 kB' 'Shmem: 4065692 kB' 'KernelStack: 10104 kB' 'PageTables: 3636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125032 kB' 'Slab: 365204 kB' 'SReclaimable: 125032 kB' 'SUnreclaim: 240172 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.137 11:44:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.137 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.137 11:44:48 [... repeated common.sh@31/@32 IFS/read/continue records elided: the scan walks every remaining node-0 meminfo field, none matching HugePages_Surp ...] 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
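The scan traced above is `get_meminfo` from `setup/common.sh`: it picks the per-node meminfo file when a node index is given, strips the `Node N ` prefix the kernel adds to per-node lines, and walks `field: value` pairs until the requested field matches. A minimal standalone sketch of that logic (not the verbatim SPDK helper; the optional third file argument is added here purely so the function can be exercised against a fixture):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo loop seen in the trace: select the meminfo
# source, drop the "Node N " prefix, and print the requested field.
get_meminfo() {
    local get=$1 node=$2
    local mem_f=${3:-/proc/meminfo}   # file override is for illustration only
    local line var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while read -r line; do
        # Per-node files prefix every line with "Node N "; drop that part.
        [[ $line == Node\ * ]] && line=${line#Node * }
        IFS=': ' read -r var val _ <<<"$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$mem_f"
    return 1
}
```

The linear field-by-field scan is what produces the long run of `IFS`/`read`/`continue` records in the xtrace output above.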
00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688360 kB' 'MemFree: 56319820 kB' 'MemUsed: 4368540 kB' 'SwapCached: 0 kB' 'Active: 1685880 kB' 'Inactive: 98224 kB' 'Active(anon): 1446344 kB' 'Inactive(anon): 0 kB' 'Active(file): 239536 kB' 'Inactive(file): 98224 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1427356 kB' 'Mapped: 95644 kB' 'AnonPages: 356780 kB' 'Shmem: 1089596 kB' 'KernelStack: 9352 kB' 'PageTables: 4196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 71120 kB' 'Slab: 222652 kB' 'SReclaimable: 71120 kB' 'SUnreclaim: 151532 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:03:58.138 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:58.138 11:44:48 [... repeated common.sh@31/@32 IFS/read/continue records elided: the scan walks every remaining node-1 meminfo field, none matching HugePages_Surp ...] 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:58.139 node0=512 expecting 512 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:58.139 node1=512 expecting 512 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:58.139 00:03:58.139 real 0m3.542s 00:03:58.139 user 0m1.452s 00:03:58.139 sys 0m2.159s 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:58.139 11:44:48 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:58.139 ************************************ 00:03:58.139 END TEST per_node_1G_alloc 00:03:58.139 ************************************ 00:03:58.139 11:44:48 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:03:58.139 11:44:48 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:58.139 11:44:48 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:58.139 11:44:48 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:58.139 11:44:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:58.139 ************************************ 00:03:58.139 START TEST even_2G_alloc 00:03:58.139 ************************************ 00:03:58.139 11:44:48 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:03:58.139 
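The `even_2G_alloc` test starting here calls `get_test_nr_hugepages 2097152`, which converts the requested size to hugepages and spreads them evenly across the NUMA nodes; the earlier `node0=512 expecting 512` / `node1=512 expecting 512` output is exactly that split (2097152 kB / 2048 kB per page = 1024 pages, halved over 2 nodes). A minimal sketch of the arithmetic, assuming the default 2048 kB x86_64 hugepage size shown in the trace:

```shell
#!/usr/bin/env bash
# Even per-node hugepage split, as exercised by even_2G_alloc:
# size in kB -> number of default-size hugepages -> equal share per node.
split_hugepages() {
    local size_kb=$1 nr_nodes=$2
    local hp_kb=2048                       # assumed default hugepage size
    local nr_pages=$(( size_kb / hp_kb ))
    local per_node=$(( nr_pages / nr_nodes ))
    local node
    for (( node = 0; node < nr_nodes; node++ )); do
        echo "node$node=$per_node"
    done
}
```

For example, `split_hugepages 2097152 2` reproduces the `node0=512` / `node1=512` expectation checked above.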
11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:58.140 11:44:48 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:58.140 11:44:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:01.433 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:01.433 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:01.433 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:80:04.3 (8086 2021): 
Already using the vfio-pci driver 00:04:01.433 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:01.433 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:01.433 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t 
mem 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78913444 kB' 'MemAvailable: 82293880 kB' 'Buffers: 11472 kB' 'Cached: 9022480 kB' 'SwapCached: 0 kB' 'Active: 6058392 kB' 'Inactive: 3499164 kB' 'Active(anon): 5678964 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526884 kB' 'Mapped: 158892 kB' 'Shmem: 5155360 kB' 'KReclaimable: 196144 kB' 'Slab: 587576 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391432 kB' 'KernelStack: 19936 kB' 'PageTables: 8940 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7111956 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217980 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 
11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.434 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78914036 kB' 'MemAvailable: 82294472 kB' 'Buffers: 11472 kB' 
'Cached: 9022480 kB' 'SwapCached: 0 kB' 'Active: 6057736 kB' 'Inactive: 3499164 kB' 'Active(anon): 5678308 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526200 kB' 'Mapped: 158884 kB' 'Shmem: 5155360 kB' 'KReclaimable: 196144 kB' 'Slab: 587568 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391424 kB' 'KernelStack: 19776 kB' 'PageTables: 8416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7113464 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217932 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.435 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.435 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.436 11:44:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue
[... xtrace elided: the IFS=': ' / read -r var val _ loop tests and skips each remaining /proc/meminfo field (Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd) ...]
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:01.437 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:01.437 11:44:51
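The xtrace above is setup/common.sh's get_meminfo helper walking /proc/meminfo line by line with `IFS=': '` and `read -r var val _` until the requested field (here HugePages_Surp, then HugePages_Rsvd) matches. A minimal standalone sketch of the same parsing pattern — simplified, not the actual SPDK implementation; the per-node /sys/devices/system/node handling seen in the trace is omitted:

```shell
#!/usr/bin/env bash
# Sketch of the /proc/meminfo scan exercised in the trace above.
# IFS=': ' splits each line on the colon and surrounding spaces, so
# "HugePages_Surp:   0" yields var=HugePages_Surp, val=0.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"   # print the field's value (kB suffix lands in $_)
            return 0
        fi
    done < /proc/meminfo
    return 1              # field not present in /proc/meminfo
}

get_meminfo MemTotal      # prints the MemTotal value in kB
```

The trailing `_` in the `read` call soaks up the "kB" unit column, so callers get a bare number they can use in arithmetic such as the `(( 1024 == nr_hugepages + surp + resv ))` check later in the log.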
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78912464 kB' 'MemAvailable: 82292900 kB' 'Buffers: 11472 kB' 'Cached: 9022500 kB' 'SwapCached: 0 kB' 'Active: 6057536 kB' 'Inactive: 3499164 kB' 'Active(anon): 5678108 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525980 kB' 'Mapped: 158884 kB' 'Shmem: 5155380 kB' 'KReclaimable: 196144 kB' 'Slab: 587608 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391464 kB' 'KernelStack: 19488 kB' 'PageTables: 7672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122496 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217820 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB'
[... xtrace elided: the IFS=': ' / read -r var val _ loop tests and skips each /proc/meminfo field in order (MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free) until it reaches HugePages_Rsvd ...]
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:01.702 nr_hugepages=1024
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:01.702 resv_hugepages=0
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:01.702 surplus_hugepages=0
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:01.702 anon_hugepages=0
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:01.702 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78908148 kB' 'MemAvailable: 82288584 kB' 'Buffers: 11472 kB' 'Cached: 9022520 kB' 'SwapCached: 0 kB' 'Active: 6057236 kB' 'Inactive: 3499164 kB' 'Active(anon): 5677808 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525696 kB' 'Mapped: 158872 kB' 'Shmem: 5155400 kB' 'KReclaimable: 196144 kB' 'Slab: 587632 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391488 kB' 'KernelStack: 19552 kB' 'PageTables: 7960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7110524 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217788 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB'
00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal ==
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.703 
11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.703 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 
11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 22554944 kB' 'MemUsed: 10079684 kB' 'SwapCached: 0 kB' 'Active: 4371676 kB' 'Inactive: 3400940 kB' 'Active(anon): 4231784 kB' 'Inactive(anon): 0 kB' 'Active(file): 139892 kB' 'Inactive(file): 3400940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7606560 kB' 'Mapped: 63228 kB' 'AnonPages: 169196 kB' 'Shmem: 4065728 kB' 'KernelStack: 10072 kB' 'PageTables: 3508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125024 kB' 'Slab: 365036 kB' 'SReclaimable: 125024 kB' 'SUnreclaim: 240012 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:01.704 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:01.705 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.705 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:01.705 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:01.705 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:01.705 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:01.705 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[... the IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue xtrace cycle repeats for each remaining node0 meminfo field, Inactive(anon) through HugePages_Free ...]
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:01.706 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688360 kB' 'MemFree: 56352968 kB' 'MemUsed: 4335392 kB' 'SwapCached: 0 kB' 'Active: 1685792 kB' 'Inactive: 98224 kB' 'Active(anon): 1446256 kB' 'Inactive(anon): 0 kB' 'Active(file): 239536 kB' 'Inactive(file): 98224 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1427476 kB' 'Mapped: 95644 kB' 'AnonPages: 356632 kB' 'Shmem: 1089716 kB' 'KernelStack: 9480 kB' 'PageTables: 4424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 71120 kB' 'Slab: 222596 kB' 'SReclaimable: 71120 kB' 'SUnreclaim: 151476 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... the same field-by-field [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue xtrace cycle repeats for node1, MemTotal through HugePages_Free ...]
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:01.707 node0=512 expecting 512
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:01.707 node1=512 expecting 512
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:01.707
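The xtrace above is setup/common.sh's get_meminfo walking the node's meminfo file field by field until it reaches the requested key (here HugePages_Surp) and echoing its value. A minimal runnable sketch of that lookup follows; it is an illustration, not the exact SPDK source, and the MEM_F override is a hypothetical testing convenience that the real script does not have:

```shell
#!/usr/bin/env bash
# Sketch of the meminfo lookup traced above (illustration only, not the
# exact SPDK setup/common.sh code). MEM_F is a hypothetical override so
# the function can be exercised against a fixture file.
shopt -s extglob

get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=${MEM_F:-/proc/meminfo}
    # Per-node counters live under sysfs when a node index is given
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # Node files prefix every line with "Node <n> "; strip it (extglob)
    mem=("${mem[@]#Node +([0-9]) }")
    local line var val _
    for line in "${mem[@]}"; do
        # Same tokenization as the trace: split "Key: value [kB]" on ": "
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"   # value in kB, or a bare page count
            return 0
        fi
    done
    return 1
}
```

Against the node-1 snapshot printed in the trace, `get_meminfo HugePages_Surp 1` walks every field until the last line matches and echoes 0, which is the `echo 0` / `return 0` pair the xtrace shows.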
00:04:01.707 real 0m3.507s
00:04:01.707 user 0m1.399s
00:04:01.707 sys 0m2.174s
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:01.707 11:44:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:01.707 ************************************
00:04:01.707 END TEST even_2G_alloc
00:04:01.707 ************************************
00:04:01.707 11:44:51 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:01.707 11:44:51 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:01.707 11:44:51 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:01.707 11:44:51 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:01.707 11:44:51 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:01.707 ************************************
00:04:01.707 START TEST odd_alloc
00:04:01.707 ************************************
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:01.707 11:44:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:04.997 0000:5f:00.0 (1b96 2600): Skipping denied
controller at 0000:5f:00.0
00:04:04.997 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:04.997 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:04.997 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:04.997 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:04.997 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:04.997 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:04.997 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:04.997 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:04.997 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:04.997 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:04.998 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:04.998 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:04.998 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:04.998 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:04.998 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:04.998 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:04.998 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages
11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node
11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t
11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s
11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp
11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv
11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon
11:44:55 setup.sh.hugepages.odd_alloc --
setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78886328 kB' 'MemAvailable: 82266764 kB' 'Buffers: 11472 kB' 'Cached: 9022648 kB' 'SwapCached: 0 kB' 'Active: 6058000 kB' 'Inactive: 3499164 kB' 'Active(anon): 5678572 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525888 kB' 'Mapped: 158892 kB' 'Shmem: 5155528 kB' 'KReclaimable: 196144 kB' 'Slab: 587876 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391732 kB' 'KernelStack: 19440 kB' 'PageTables: 7800 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000496 kB' 'Committed_AS: 7111640 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217916 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 
11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.998 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78886640 kB' 'MemAvailable: 82267076 kB' 'Buffers: 11472 kB' 'Cached: 9022648 kB' 'SwapCached: 0 kB' 'Active: 6057892 kB' 'Inactive: 3499164 kB' 'Active(anon): 5678464 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526276 kB' 'Mapped: 158884 kB' 'Shmem: 5155528 kB' 'KReclaimable: 196144 kB' 'Slab: 587952 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391808 kB' 'KernelStack: 19472 kB' 'PageTables: 7892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000496 kB' 'Committed_AS: 7111656 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217884 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.999 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.999 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:04.999 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.000 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.001 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.001 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.264 
11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.264 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78887624 kB' 'MemAvailable: 82268060 kB' 'Buffers: 11472 kB' 'Cached: 9022668 kB' 'SwapCached: 0 kB' 'Active: 6057920 kB' 'Inactive: 3499164 kB' 'Active(anon): 5678492 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526284 kB' 'Mapped: 158884 
kB' 'Shmem: 5155548 kB' 'KReclaimable: 196144 kB' 'Slab: 587952 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391808 kB' 'KernelStack: 19472 kB' 'PageTables: 7892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000496 kB' 'Committed_AS: 7111676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217884 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 
11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.265 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:05.266 nr_hugepages=1025 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:05.266 resv_hugepages=0 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:05.266 surplus_hugepages=0 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:05.266 anon_hugepages=0 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # 
get_meminfo HugePages_Total 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78888496 kB' 'MemAvailable: 82268932 kB' 'Buffers: 11472 kB' 'Cached: 9022688 kB' 'SwapCached: 0 kB' 'Active: 6057928 kB' 'Inactive: 3499164 kB' 'Active(anon): 5678500 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526284 kB' 'Mapped: 158884 kB' 'Shmem: 5155568 kB' 'KReclaimable: 196144 kB' 'Slab: 587952 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391808 kB' 'KernelStack: 19472 kB' 'PageTables: 7892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000496 kB' 'Committed_AS: 7111700 kB' 'VmallocTotal: 34359738367 kB' 
'VmallocUsed: 217884 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.266 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.266 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 
11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.267 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 
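The trace above is bash xtrace output from a `get_meminfo` helper: it snapshots a meminfo file into an array, strips the `Node <N> ` prefix that per-node files carry, then reads `Key: value` pairs until the requested key matches and echoes its value (here `HugePages_Total` → `1025`). A minimal, self-contained sketch of that same pattern follows; it is an illustration, not the SPDK `setup/common.sh` source, and the `MEMINFO_F` override is a hypothetical addition so the sketch can be exercised against a sample file:

```shell
#!/usr/bin/env bash
# extglob is needed for the +([0-9]) pattern used to strip "Node N " prefixes.
shopt -s extglob

# get_meminfo KEY [NODE]
# Print the value of KEY from /proc/meminfo, or from the per-node file
# /sys/devices/system/node/node<NODE>/meminfo when NODE is given.
# MEMINFO_F may override the source file (sketch-only, for testing).
get_meminfo() {
    local get=$1 node=${2:-} var val _ mem
    local mem_f=${MEMINFO_F:-/proc/meminfo}
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo

    # Snapshot the whole file at once so all values come from one read.
    mapfile -t mem <"$mem_f"
    # Per-node files prefix every line with "Node <N> "; strip it.
    mem=("${mem[@]#Node +([0-9]) }")

    # Scan "Key: value [unit]" pairs until the requested key matches.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}
```

Snapshotting with `mapfile` before scanning mirrors the trace: the repeated `[[ Key == HugePages_Total ]] ... continue` lines are just this loop skipping every non-matching key under `set -x`.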
00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 
-- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 22538420 kB' 'MemUsed: 10096208 kB' 'SwapCached: 0 kB' 'Active: 4372292 kB' 'Inactive: 3400940 kB' 'Active(anon): 4232400 kB' 'Inactive(anon): 0 kB' 'Active(file): 139892 kB' 'Inactive(file): 3400940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7606608 kB' 'Mapped: 63240 kB' 'AnonPages: 169784 kB' 'Shmem: 4065776 kB' 'KernelStack: 10088 kB' 'PageTables: 3584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125024 kB' 'Slab: 365316 kB' 'SReclaimable: 125024 kB' 'SUnreclaim: 240292 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.268 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 
11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:05.269 11:44:55 
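The long runs of `[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]` / `continue` above are the `get_meminfo` helper from setup/common.sh scanning meminfo one field at a time until it hits the requested key. A minimal sketch of that logic, reconstructed from the trace (the real helper at common.sh@17-33 uses `mapfile` plus extglob prefix stripping rather than `sed`, so treat this as an approximation):

```shell
#!/usr/bin/env bash
# get_meminfo FIELD [NODE]: print FIELD's value from /proc/meminfo, or from
# the per-node meminfo file when NODE is given (as in "get_meminfo
# HugePages_Surp 1" in the trace).
get_meminfo() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node files prefix every line with "Node N "; strip it, then scan
    # field by field -- each non-matching field is one "continue" in the log.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "${val:-0}"
        return 0
    done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
    return 1
}
```

With `HugePages_Surp` on a node with no surplus pages, this prints `0` and returns 0, which is exactly the `echo 0` / `return 0` pair at common.sh@33 in the trace.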
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.269 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688360 kB' 'MemFree: 56350328 kB' 'MemUsed: 4338032 kB' 'SwapCached: 0 kB' 'Active: 1685680 kB' 'Inactive: 98224 kB' 'Active(anon): 1446144 kB' 'Inactive(anon): 0 kB' 'Active(file): 239536 kB' 'Inactive(file): 98224 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1427592 kB' 'Mapped: 95644 kB' 'AnonPages: 356496 kB' 
'Shmem: 1089832 kB' 'KernelStack: 9384 kB' 'PageTables: 4308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 71120 kB' 'Slab: 222636 kB' 'SReclaimable: 71120 kB' 'SUnreclaim: 151516 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 
11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.270 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.270 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.271 11:44:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:05.271 node0=512 expecting 513 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:05.271 node1=513 expecting 512 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:05.271 00:04:05.271 real 0m3.531s 00:04:05.271 user 0m1.524s 00:04:05.271 sys 0m2.073s 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:05.271 11:44:55 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:05.271 ************************************ 00:04:05.271 END TEST odd_alloc 00:04:05.271 ************************************ 00:04:05.271 11:44:55 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:05.271 11:44:55 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:05.271 11:44:55 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:05.271 11:44:55 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:05.271 11:44:55 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:05.271 ************************************ 00:04:05.271 START TEST custom_alloc 00:04:05.271 ************************************ 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc 
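The odd_alloc result above passes via `[[ 512 513 == \5\1\2\ \5\1\3 ]]` even though the nodes report `node0=512 expecting 513` and `node1=513 expecting 512`: with an odd total of 1025 pages the kernel may place the extra page on either node, so hugepages.sh@126-130 compares the counts order-insensitively. A sketch of that trick (array names mirror the trace; the example node values are illustrative):

```shell
#!/usr/bin/env bash
# Order-insensitive comparison of per-node hugepage counts: using each count
# as an *indexed*-array subscript makes the subscripts expand back in
# ascending numeric order, so 513/512 and 512/513 compare equal.
nodes_test=(513 512)   # observed: odd page landed on node 0 (example values)
nodes_sys=(512 513)    # expected split of 1025 pages across 2 nodes
declare -a sorted_t=() sorted_s=()
for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1
    sorted_s[nodes_sys[node]]=1
done
# Both subscript lists now read "512 513", whichever node got the odd page.
[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo "odd_alloc placement OK"
```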
-- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes 
- 1]=256 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:05.271 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:05.272 11:44:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:08.564 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:08.564 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:08.564 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:80:04.5 (8086 2021): Already 
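The custom_alloc prologue above converts two requested sizes into hugepage counts (1048576 → 512, 2097152 → 1024) and joins the per-node counts into the `HUGENODE` value handed to scripts/setup.sh. A sketch of that arithmetic, assuming sizes are in kB against a 2048 kB default page (consistent with `Hugepagesize: 2048 kB` and the resulting counts in the trace; the real hugepages.sh builds `HUGENODE` as an array joined with `IFS=,` rather than a plain string):

```shell
#!/usr/bin/env bash
# custom_alloc sizing: kB size -> count of default-size hugepages, then
# per-node counts -> the HUGENODE string seen at hugepages.sh@187.
default_hugepages=2048  # kB, from Hugepagesize in the trace

get_test_nr_hugepages() {
    local size=$1  # kB
    echo $((size / default_hugepages))
}

declare -a nodes_hp
nodes_hp[0]=$(get_test_nr_hugepages 1048576)  # 1 GiB on node 0 -> 512 pages
nodes_hp[1]=$(get_test_nr_hugepages 2097152)  # 2 GiB on node 1 -> 1024 pages

HUGENODE='' _nr_hugepages=0
for node in "${!nodes_hp[@]}"; do
    HUGENODE+=${HUGENODE:+,}"nodes_hp[$node]=${nodes_hp[node]}"
    ((_nr_hugepages += nodes_hp[node]))
done
echo "$HUGENODE"        # nodes_hp[0]=512,nodes_hp[1]=1024
echo "$_nr_hugepages"   # 1536, matching HugePages_Total: 1536 in the trace
```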
using the vfio-pci driver 00:04:08.564 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:08.564 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node/meminfo ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 77823008 kB' 'MemAvailable: 81203444 kB' 'Buffers: 11472 kB' 'Cached: 9022800 kB' 'SwapCached: 0 kB' 'Active: 6058420 kB' 'Inactive: 3499164 kB' 'Active(anon): 5678992 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526596 kB' 'Mapped: 158908 kB' 'Shmem: 5155680 kB' 'KReclaimable: 196144 kB' 'Slab: 587544 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391400 kB' 'KernelStack: 19488 kB' 'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477232 kB' 'Committed_AS: 7112184 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217900 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.564 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.565 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 
11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.829 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 77823876 kB' 
'MemAvailable: 81204312 kB' 'Buffers: 11472 kB' 'Cached: 9022804 kB' 'SwapCached: 0 kB' 'Active: 6058560 kB' 'Inactive: 3499164 kB' 'Active(anon): 5679132 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526756 kB' 'Mapped: 158888 kB' 'Shmem: 5155684 kB' 'KReclaimable: 196144 kB' 'Slab: 587556 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391412 kB' 'KernelStack: 19472 kB' 'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477232 kB' 'Committed_AS: 7112200 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217868 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 
11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.830 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.831 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 77823120 kB' 
'MemAvailable: 81203556 kB' 'Buffers: 11472 kB' 'Cached: 9022804 kB' 'SwapCached: 0 kB' 'Active: 6058560 kB' 'Inactive: 3499164 kB' 'Active(anon): 5679132 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526756 kB' 'Mapped: 158888 kB' 'Shmem: 5155684 kB' 'KReclaimable: 196144 kB' 'Slab: 587556 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391412 kB' 'KernelStack: 19472 kB' 'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477232 kB' 'Committed_AS: 7112224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217868 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 
11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.832 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.832 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@33 -- # echo 0 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:08.833 nr_hugepages=1536 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:08.833 resv_hugepages=0 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:08.833 surplus_hugepages=0 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:08.833 anon_hugepages=0 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.833 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.834 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 77823480 kB' 'MemAvailable: 81203916 kB' 'Buffers: 11472 kB' 'Cached: 9022808 kB' 'SwapCached: 0 kB' 'Active: 6059380 kB' 'Inactive: 3499164 kB' 'Active(anon): 5679952 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527588 kB' 'Mapped: 159660 kB' 'Shmem: 5155688 kB' 'KReclaimable: 196144 kB' 'Slab: 587556 kB' 'SReclaimable: 196144 kB' 'SUnreclaim: 391412 kB' 'KernelStack: 19440 kB' 'PageTables: 7792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477232 kB' 'Committed_AS: 7113864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217868 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:08.834 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:08.834 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.834 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.834 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
read -r var val _
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:08.835 11:44:58
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.835 11:44:58
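The `get_nodes` step traced here enumerates `/sys/devices/system/node/node+([0-9])` (an extglob pattern, hence the `+([0-9])` in the trace) and records a per-NUMA-node hugepage count, ending with `no_nodes=2` on this two-socket machine. A hedged sketch of just the node-discovery walk; the helper name `list_numa_nodes` is invented for this sketch, and the base-directory parameter is an addition so the walk can run outside real sysfs:

```shell
#!/usr/bin/env bash
# extglob must be enabled before a function using +([0-9]) is parsed.
shopt -s extglob nullglob
# Sketch of the node-discovery step from the get_nodes trace above:
# enumerate NUMA node directories and print their numeric ids.
# list_numa_nodes and the base-dir argument are assumptions for this
# sketch, not the exact SPDK helper interface.
list_numa_nodes() {
    local base=${1:-/sys/devices/system/node}
    local node
    for node in "$base"/node+([0-9]); do
        printf '%s\n' "${node##*node}"   # .../node0 -> 0, .../node1 -> 1
    done
}
```

On the machine in this log it would print `0` and `1`, consistent with the trace assigning `nodes_sys[0]` and `nodes_sys[1]` and then `no_nodes=2`.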
setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 22520784 kB' 'MemUsed: 10113844 kB' 'SwapCached: 0 kB' 'Active: 4376988 kB' 'Inactive: 3400940 kB' 'Active(anon): 4237096 kB' 'Inactive(anon): 0 kB' 'Active(file): 139892 kB' 'Inactive(file): 3400940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7606688 kB' 'Mapped: 63244 kB' 'AnonPages: 174376 kB' 'Shmem: 4065856 kB' 'KernelStack: 10088 kB' 'PageTables: 3612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125024 kB' 'Slab: 365152 kB' 'SReclaimable: 125024 kB' 'SUnreclaim: 240128 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:08.835 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.836 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.836 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 
'MemTotal: 60688360 kB' 'MemFree: 55293876 kB' 'MemUsed: 5394484 kB' 'SwapCached: 0 kB' 'Active: 1687492 kB' 'Inactive: 98224 kB' 'Active(anon): 1447956 kB' 'Inactive(anon): 0 kB' 'Active(file): 239536 kB' 'Inactive(file): 98224 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1427652 kB' 'Mapped: 96056 kB' 'AnonPages: 358272 kB' 'Shmem: 1089892 kB' 'KernelStack: 9384 kB' 'PageTables: 4292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 71120 kB' 'Slab: 222404 kB' 'SReclaimable: 71120 kB' 'SUnreclaim: 151284 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.837 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
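The `hugepages.sh@115-117` entries interleaved with this trace accumulate each node's reserved and surplus page counts into `nodes_test` before the final `node0=512 expecting 512` / `node1=1024 expecting 1024` comparison. A hedged sketch of that accounting step, using the counts this log actually reports (the `resv`/`surp` array names are illustrative; in the real script `surp` comes from the `get_meminfo HugePages_Surp` calls traced here, which all return 0):

```shell
#!/usr/bin/env bash
# Per-node hugepage accounting as traced at hugepages.sh@115-117:
# each node's test count absorbs its reserved and surplus pages,
# then the result is echoed for the verify step.
nodes_test=(512 1024)   # node0 / node1 counts from this log
resv=(0 0)              # reserved pages per node
surp=(0 0)              # surplus pages (get_meminfo HugePages_Surp -> 0 here)

for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv[node] ))   # @116
    (( nodes_test[node] += surp[node] ))   # @117
    echo "node${node}=${nodes_test[node]} expecting ${nodes_test[node]}"
done
```

With zero reserved and surplus pages the counts pass through unchanged, which is why the verify step later in this log matches `512,1024` exactly.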
00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( 
nodes_test[node] += 0 )) 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:08.838 node0=512 expecting 512 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:08.838 node1=1024 expecting 1024 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:08.838 00:04:08.838 real 0m3.533s 00:04:08.838 user 0m1.478s 00:04:08.838 sys 0m2.123s 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:08.838 11:44:58 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:08.838 ************************************ 00:04:08.838 END TEST custom_alloc 00:04:08.838 ************************************ 00:04:08.838 11:44:58 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:08.838 11:44:58 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:08.838 11:44:58 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:08.838 11:44:58 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:08.838 11:44:58 
setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:08.838 ************************************ 00:04:08.838 START TEST no_shrink_alloc 00:04:08.838 ************************************ 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- 
# (( 1 > 0 )) 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.838 11:44:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:12.127 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:12.127 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:12.127 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:12.127 0000:80:04.0 (8086 2021): Already 
using the vfio-pci driver 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.127 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78785296 kB' 'MemAvailable: 82165724 kB' 'Buffers: 11472 kB' 'Cached: 9022960 kB' 'SwapCached: 0 kB' 'Active: 6065992 kB' 'Inactive: 3499164 kB' 'Active(anon): 5686564 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533648 kB' 'Mapped: 159992 kB' 'Shmem: 5155840 kB' 'KReclaimable: 196128 kB' 'Slab: 587700 kB' 'SReclaimable: 196128 kB' 'SUnreclaim: 391572 kB' 'KernelStack: 19536 kB' 'PageTables: 8116 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122308 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217872 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB'
[xtrace elided: setup/common.sh@31-32 scans every /proc/meminfo key above against AnonHugePages (IFS=': '; read -r var val _; continue on non-match); the identical per-key iterations from MemTotal through HardwareCorrupted are omitted]
00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.393 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78786304 kB' 'MemAvailable: 82166732 kB' 'Buffers: 11472 kB' 'Cached: 9022964 kB' 'SwapCached: 0 kB' 'Active: 6066204 kB' 'Inactive: 3499164 kB' 'Active(anon): 5686776 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533900 kB' 'Mapped: 159956 kB' 'Shmem: 5155844 kB' 'KReclaimable: 196128 kB' 'Slab: 587700 kB' 'SReclaimable: 196128 kB' 'SUnreclaim: 391572 kB' 'KernelStack: 19552 kB' 'PageTables: 8156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7123448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217856 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB'
[xtrace elided: the same setup/common.sh@31-32 per-key scan is repeated against HugePages_Surp; log truncated mid-scan]
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node/meminfo ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78792284 kB' 'MemAvailable: 82172712 kB' 'Buffers: 11472 kB' 'Cached: 9022984 kB' 'SwapCached: 0 kB' 'Active: 6066204 kB' 'Inactive: 3499164 kB' 'Active(anon): 5686776 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534332 kB' 'Mapped: 159892 kB' 'Shmem: 5155864 kB' 'KReclaimable: 196128 kB' 'Slab: 587712 kB' 'SReclaimable: 196128 kB' 'SUnreclaim: 391584 kB' 'KernelStack: 19520 kB' 'PageTables: 8084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7124960 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217840 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.395 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.396 11:45:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.397 11:45:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:12.397 nr_hugepages=1024 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:12.397 resv_hugepages=0 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:12.397 surplus_hugepages=0 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:12.397 
anon_hugepages=0 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78792276 kB' 'MemAvailable: 82172704 kB' 'Buffers: 11472 kB' 'Cached: 9022984 kB' 'SwapCached: 0 kB' 'Active: 6065904 kB' 'Inactive: 3499164 kB' 'Active(anon): 5686476 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 
0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534036 kB' 'Mapped: 159892 kB' 'Shmem: 5155864 kB' 'KReclaimable: 196128 kB' 'Slab: 587680 kB' 'SReclaimable: 196128 kB' 'SUnreclaim: 391552 kB' 'KernelStack: 19648 kB' 'PageTables: 8064 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7124984 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217824 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.397 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 
11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 
11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.398 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in 
/sys/devices/system/node/node+([0-9]) 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.399 11:45:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 21419912 kB' 'MemUsed: 11214716 kB' 'SwapCached: 0 kB' 'Active: 4378764 kB' 'Inactive: 3400940 kB' 'Active(anon): 4238872 kB' 'Inactive(anon): 0 kB' 'Active(file): 139892 kB' 'Inactive(file): 3400940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7606740 kB' 'Mapped: 63428 kB' 'AnonPages: 176216 kB' 'Shmem: 4065908 kB' 'KernelStack: 10184 kB' 'PageTables: 4080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125024 kB' 'Slab: 365320 kB' 'SReclaimable: 125024 kB' 'SUnreclaim: 240296 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.399 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:12.400 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.401 11:45:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:12.401 node0=1024 expecting 1024 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:12.401 11:45:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:15.695 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:15.695 0000:5e:00.0 (8086 
0a54): Already using the vfio-pci driver 00:04:15.695 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:15.695 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:15.695 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:15.695 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:15.695 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:15.695 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:15.695 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:15.695 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:15.695 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:15.695 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:15.696 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:15.696 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:15.696 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:15.696 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:15.696 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:15.696 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:15.696 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 
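The `INFO: Requested 512 hugepages but 1024 already allocated on node0` message above comes from setup.sh comparing the requested count (`NRHUGE=512`) against what the node already holds. A minimal sketch of that comparison, assuming a simplified `check_node_hugepages` helper (the real setup.sh logic reads per-node sysfs counters and is more involved):

```shell
#!/usr/bin/env bash
# Hypothetical helper, not the actual setup.sh code: report when a node
# already has at least as many hugepages as were requested, mirroring the
# INFO line seen in the trace.
check_node_hugepages() {
  local requested=$1 allocated=$2 node=${3:-node0}
  if (( allocated >= requested )); then
    # Enough pages are already allocated; nothing to do.
    echo "INFO: Requested $requested hugepages but $allocated already allocated on $node"
    return 0
  fi
  # Fewer pages than requested; the real script would allocate more here.
  return 1
}
```

With the values from this run (`NRHUGE=512` requested, 1024 already present), the helper reproduces the logged message.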
00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78791472 kB' 'MemAvailable: 82171900 kB' 'Buffers: 11472 kB' 'Cached: 9023100 kB' 'SwapCached: 0 kB' 'Active: 6066696 kB' 'Inactive: 3499164 kB' 'Active(anon): 5687268 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534632 kB' 'Mapped: 159752 kB' 'Shmem: 5155980 kB' 
'KReclaimable: 196128 kB' 'Slab: 587916 kB' 'SReclaimable: 196128 kB' 'SUnreclaim: 391788 kB' 'KernelStack: 19536 kB' 'PageTables: 8096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217888 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.696 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 
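The long `IFS=': ' read -r var val _` / `continue` runs in the trace are one technique repeated per meminfo field: `get_meminfo` walks the `Key: value kB` lines, echoes the value of the requested field, and falls back to `echo 0` (the `common.sh@33 -- # echo 0 / return 0` entries, which is how `anon=0` above is produced). A minimal sketch of that pattern, simplified from the traced `setup/common.sh` logic (the file argument is an assumption for testability; the real script reads `/proc/meminfo` or a per-node `meminfo`):

```shell
#!/usr/bin/env bash
# Simplified sketch of the get_meminfo scan seen in the trace: split each
# "Key: value kB" line on ':' and ' ', print the value of the matching key,
# default to 0 when the key is absent.
get_meminfo() {
  local get=$1 file=${2:-/proc/meminfo} var val _
  while IFS=': ' read -r var val _; do
    # Matching field found: print its value and stop, as the trace's
    # "echo ... / return 0" does; non-matching fields just continue.
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done < "$file"
  echo 0
}
```

For example, scanning this run's meminfo dump for `HugePages_Surp` yields `0`, which is why the surrounding `hugepages.sh` arithmetic adds `0` to `nodes_test[node]`.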
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78792684 kB' 'MemAvailable: 82173112 kB' 'Buffers: 11472 kB' 'Cached: 9023104 kB' 'SwapCached: 0 kB' 'Active: 6066392 kB' 'Inactive: 3499164 kB' 'Active(anon): 5686964 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534360 kB' 'Mapped: 159736 kB' 'Shmem: 5155984 kB' 'KReclaimable: 196128 kB' 'Slab: 588024 kB' 'SReclaimable: 196128 kB' 'SUnreclaim: 391896 kB' 'KernelStack: 19552 kB' 'PageTables: 8156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217840 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB'
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:15.697 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[identical common.sh@31/@32 compare-and-continue trace repeated for each remaining /proc/meminfo field]
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78794460 kB' 'MemAvailable: 82174888 kB' 'Buffers: 11472 kB' 'Cached: 9023104 kB' 'SwapCached: 0 kB' 'Active: 6066116 kB' 'Inactive: 3499164 kB' 'Active(anon): 5686688 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534080 kB' 'Mapped: 159736 kB' 'Shmem: 5155984 kB' 'KReclaimable: 196128 kB' 'Slab: 588024 kB' 'SReclaimable: 196128 kB' 'SUnreclaim: 391896 kB' 'KernelStack: 19552 kB' 'PageTables: 8156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217840 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB'
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:15.699 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[identical common.sh@31/@32 compare-and-continue trace repeated for the remaining /proc/meminfo fields]
val _ 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.700 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.701 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.963 11:45:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:15.963 nr_hugepages=1024 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:15.963 resv_hugepages=0 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:15.963 surplus_hugepages=0 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:15.963 anon_hugepages=0 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:15.963 11:45:05 
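The trace above is setup/common.sh's get_meminfo helper scanning /proc/meminfo line by line: it splits each line with IFS=': ', skips every key that does not match the requested one (the long run of "continue" entries), and echoes the matched value. A self-contained sketch of that loop, with the function name and the heredoc input being illustrative stand-ins rather than taken from the script:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo loop traced above: split each meminfo line with
# IFS=': ', skip every key (the "continue" entries) until the requested one
# matches, then echo its value. A heredoc stands in for /proc/meminfo so the
# sketch runs anywhere; get_meminfo_sketch is a hypothetical name.
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # non-matching keys are skipped
        echo "$val"                        # matched: print the value and stop
        return 0
    done
}

resv=$(get_meminfo_sketch HugePages_Rsvd <<'EOF'
HugePages_Total:    1024
HugePages_Free:     1024
HugePages_Rsvd:        0
HugePages_Surp:        0
EOF
)
echo "resv_hugepages=$resv"
```

This mirrors why the log shows one skip iteration per meminfo key before the HugePages_Rsvd match finally echoes 0.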
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322988 kB' 'MemFree: 78795432 kB' 'MemAvailable: 82175860 kB' 'Buffers: 11472 kB' 'Cached: 9023144 kB' 'SwapCached: 0 kB' 'Active: 6066428 kB' 'Inactive: 3499164 kB' 'Active(anon): 5687000 kB' 'Inactive(anon): 0 kB' 'Active(file): 379428 kB' 'Inactive(file): 3499164 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534356 kB' 'Mapped: 159736 kB' 'Shmem: 5156024 kB' 'KReclaimable: 196128 kB' 'Slab: 588024 kB' 'SReclaimable: 196128 kB' 'SUnreclaim: 391896 kB' 'KernelStack: 19552 kB' 'PageTables: 8156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001520 kB' 'Committed_AS: 7122420 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217840 kB' 'VmallocChunk: 0 kB' 'Percpu: 60288 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB'
'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576468 kB' 'DirectMap2M: 8540160 kB' 'DirectMap1G: 93323264 kB'
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:15.963 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[... identical skip trace repeated for each remaining /proc/meminfo key (MemFree, MemAvailable, Buffers, ..., CmaTotal) until HugePages_Total is reached ...]
00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc
-- setup/common.sh@32 -- # continue 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # 
(( no_nodes > 0 )) 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 21418144 kB' 'MemUsed: 11216484 kB' 'SwapCached: 0 kB' 'Active: 4378616 kB' 'Inactive: 3400940 kB' 'Active(anon): 4238724 kB' 'Inactive(anon): 0 kB' 'Active(file): 139892 kB' 'Inactive(file): 3400940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7606776 kB' 'Mapped: 
63272 kB' 'AnonPages: 176088 kB' 'Shmem: 4065944 kB' 'KernelStack: 10168 kB' 'PageTables: 3740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125024 kB' 'Slab: 365368 kB' 'SReclaimable: 125024 kB' 'SUnreclaim: 240344 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.965 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.966 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@33 -- # return 0 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:15.967 node0=1024 expecting 1024 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:15.967 00:04:15.967 real 0m6.952s 00:04:15.967 user 0m2.776s 00:04:15.967 sys 0m4.306s 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:15.967 11:45:05 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:15.967 ************************************ 00:04:15.967 END TEST no_shrink_alloc 00:04:15.967 ************************************ 00:04:15.967 11:45:06 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@41 
-- # echo 0 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:15.967 11:45:06 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:15.967 00:04:15.967 real 0m26.109s 00:04:15.967 user 0m10.344s 00:04:15.967 sys 0m15.439s 00:04:15.967 11:45:06 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:15.967 11:45:06 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:15.967 ************************************ 00:04:15.967 END TEST hugepages 00:04:15.967 ************************************ 00:04:15.967 11:45:06 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:15.967 11:45:06 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:15.967 11:45:06 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:15.967 11:45:06 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:15.967 11:45:06 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:15.967 ************************************ 00:04:15.967 START TEST driver 00:04:15.967 ************************************ 00:04:15.967 11:45:06 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:15.967 * Looking for test storage... 
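The long run of `continue` iterations traced above comes from the `get_meminfo` helper in `setup/common.sh`: it reads each `key: value` line of `/proc/meminfo` (or a per-node `/sys/devices/system/node/nodeN/meminfo`) with `IFS=': '`, skips every field until the requested key matches, then echoes the value. A minimal standalone sketch of that pattern (an assumption for illustration; it reads from stdin rather than the meminfo files and omits the `Node N` prefix stripping the real helper does):

```shell
# Simplified sketch of the get_meminfo lookup loop seen in the trace above.
# Reads "key: value" lines from stdin; prints the value for the requested key.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Skip every field until the requested key matches
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

# Example input mirroring the meminfo dump logged above
printf '%s\n' 'MemTotal: 32634628 kB' 'HugePages_Total: 1024' 'HugePages_Surp: 0' \
    | get_meminfo HugePages_Total
```

This is why the trace prints one `[[ key == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]` / `continue` pair per meminfo field before finally hitting `echo 1024`.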
00:04:15.967 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:15.967 11:45:06 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:15.967 11:45:06 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:15.967 11:45:06 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:21.239 11:45:10 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:21.239 11:45:10 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:21.239 11:45:10 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:21.239 11:45:10 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:21.239 ************************************ 00:04:21.239 START TEST guess_driver 00:04:21.239 ************************************ 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- 
# (( 220 > 0 )) 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:21.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:21.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:21.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:21.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:21.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:21.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:21.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:21.239 Looking for driver=vfio-pci 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output 
config 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.239 11:45:10 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:23.774 11:45:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ denied == \-\> ]] 00:04:23.774 11:45:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:23.774 11:45:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci 
== vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker 
setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.034 11:45:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.972 11:45:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:24.972 11:45:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:24.972 11:45:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:24.972 11:45:15 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 
00:04:24.972 11:45:15 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:24.972 11:45:15 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:24.972 11:45:15 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:30.249 00:04:30.249 real 0m8.927s 00:04:30.249 user 0m2.752s 00:04:30.249 sys 0m4.607s 00:04:30.249 11:45:19 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:30.249 11:45:19 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:30.249 ************************************ 00:04:30.249 END TEST guess_driver 00:04:30.249 ************************************ 00:04:30.249 11:45:19 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:04:30.249 00:04:30.249 real 0m13.686s 00:04:30.249 user 0m4.165s 00:04:30.249 sys 0m7.147s 00:04:30.249 11:45:19 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:30.249 11:45:19 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:30.249 ************************************ 00:04:30.249 END TEST driver 00:04:30.249 ************************************ 00:04:30.249 11:45:19 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:30.249 11:45:19 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:30.249 11:45:19 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:30.249 11:45:19 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:30.249 11:45:19 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:30.249 ************************************ 00:04:30.249 START TEST devices 00:04:30.249 ************************************ 00:04:30.249 11:45:19 setup.sh.devices -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:30.249 * Looking for test storage... 00:04:30.249 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:30.249 11:45:19 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:30.249 11:45:19 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:30.249 11:45:19 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:30.249 11:45:19 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n1 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme1n1 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none 
!= none ]] 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n2 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme1n2 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n2/queue/zoned ]] 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ host-managed != none ]] 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1674 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:33.537 11:45:23 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:33.537 11:45:23 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:33.537 No valid GPT data, bailing 00:04:33.537 11:45:23 setup.sh.devices -- scripts/common.sh@391 -- # 
blkid -s PTTYPE -o value /dev/nvme0n1 00:04:33.537 11:45:23 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:33.537 11:45:23 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:33.537 11:45:23 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:33.537 11:45:23 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:33.537 11:45:23 setup.sh.devices -- setup/common.sh@80 -- # echo 1000204886016 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@204 -- # (( 1000204886016 >= min_disk_size )) 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1n1 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5f:00.0 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@203 -- # continue 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1n2 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5f:00.0 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@203 -- # continue 00:04:33.537 11:45:23 setup.sh.devices -- 
setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:33.537 11:45:23 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:33.537 11:45:23 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:33.537 ************************************ 00:04:33.537 START TEST nvme_mount 00:04:33.537 ************************************ 00:04:33.537 11:45:23 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:04:33.537 11:45:23 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:33.537 11:45:23 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:33.537 11:45:23 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:33.537 11:45:23 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:33.537 11:45:23 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:33.537 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:33.537 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:33.537 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:33.538 11:45:23 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:33.538 11:45:23 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:34.512 Creating new GPT entries in memory. 00:04:34.513 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:34.513 other utilities. 00:04:34.513 11:45:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:34.513 11:45:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:34.513 11:45:24 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:34.513 11:45:24 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:34.513 11:45:24 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:35.573 Creating new GPT entries in memory. 00:04:35.573 The operation has completed successfully. 
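Two pieces of logic drive the device setup traced above: the zoned-device scan (`is_block_zoned` reading `queue/zoned` under /sys/block, which flagged nvme1n2 as host-managed and excluded 0000:5f:00.0), and the sector arithmetic in setup/common.sh behind the `sgdisk --new=1:2048:2099199` call. Both can be sketched without touching real hardware; the function names and the mock-sysfs layout below are illustrative stand-ins, not SPDK's actual helpers:

```shell
# 1) Zoned-device scan: a namespace whose queue/zoned attribute is anything
#    but "none" (e.g. "host-managed") must be excluded from the tests.
#    $root stands in for /sys/block so this runs against a mock tree.
get_zoned_devs() {
    root=$1 zoned=""
    for dev in "$root"/*; do
        [ -e "$dev/queue/zoned" ] || continue
        if [ "$(cat "$dev/queue/zoned")" != "none" ]; then
            zoned="${zoned:+$zoned }${dev##*/}"
        fi
    done
    echo "$zoned"
}

# 2) Partition ranges: size is converted from bytes to 512-byte sectors,
#    the first partition starts at sector 2048, and each subsequent one
#    starts immediately after the previous end.
part_ranges() {
    part_no=$1 size=$(($2 / 512))
    part=1 part_start=0 part_end=0
    while [ "$part" -le "$part_no" ]; do
        part_start=$(( part_start == 0 ? 2048 : part_end + 1 ))
        part_end=$(( part_start + size - 1 ))
        echo "$part:$part_start:$part_end"   # feed to: sgdisk --new=P:S:E
        part=$(( part + 1 ))
    done
}

part_ranges 1 1073741824   # 1 GiB partition: 1:2048:2099199, as in the log
```

For `dm_mount` below, which partitions the same disk twice (`part_no=2`), the second range would start at sector 2099200, right after the first partition's end.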
00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 527926 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:35.573 
11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:35.573 11:45:25 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.864 11:45:28 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.864 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:38.865 11:45:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.865 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:38.865 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:38.865 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:38.865 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:38.865 11:45:29 
setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:38.865 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:38.865 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.124 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.124 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:39.124 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:39.124 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:39.124 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:39.124 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:39.382 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:39.382 /dev/nvme0n1: 8 bytes were erased at offset 0xe8e0db5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:39.382 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:39.382 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ 
-e /dev/nvme0n1 ]] 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:39.382 11:45:29 setup.sh.devices.nvme_mount -- 
setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 
11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # 
read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:42.674 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:42.675 
11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:42.675 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:42.675 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.675 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:42.675 11:45:32 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:42.675 11:45:32 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:42.675 11:45:32 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 
0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:45.961 /dev/nvme0n1: 2 bytes were erased at offset 
0x00000438 (ext4): 53 ef 00:04:45.961 00:04:45.961 real 0m12.505s 00:04:45.961 user 0m3.922s 00:04:45.961 sys 0m6.403s 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:45.961 11:45:36 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:04:45.961 ************************************ 00:04:45.961 END TEST nvme_mount 00:04:45.961 ************************************ 00:04:46.220 11:45:36 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:04:46.220 11:45:36 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:46.220 11:45:36 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:46.220 11:45:36 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:46.220 11:45:36 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:46.220 ************************************ 00:04:46.220 START TEST dm_mount 00:04:46.220 ************************************ 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:46.220 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:04:46.221 11:45:36 
setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:46.221 11:45:36 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:47.156 Creating new GPT entries in memory. 00:04:47.156 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:47.156 other utilities. 00:04:47.156 11:45:37 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:47.156 11:45:37 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:47.156 11:45:37 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:47.156 11:45:37 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:47.156 11:45:37 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:48.093 Creating new GPT entries in memory. 
00:04:48.093 The operation has completed successfully. 00:04:48.093 11:45:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:48.093 11:45:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:48.093 11:45:38 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:48.093 11:45:38 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:48.093 11:45:38 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:49.471 The operation has completed successfully. 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 532659 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 
00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:49.472 11:45:39 
setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:49.472 11:45:39 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 
== \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 
== \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:52.762 11:45:42 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.053 
11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.053 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 
0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 
0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:56.054 11:45:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:56.054 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:56.054 00:04:56.054 real 0m9.908s 00:04:56.054 user 0m2.684s 00:04:56.054 sys 0m4.256s 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:56.054 11:45:46 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:04:56.054 ************************************ 00:04:56.054 END TEST dm_mount 00:04:56.054 ************************************ 00:04:56.054 11:45:46 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:04:56.054 11:45:46 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:04:56.054 11:45:46 setup.sh.devices -- 
setup/devices.sh@11 -- # cleanup_nvme 00:04:56.054 11:45:46 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:56.054 11:45:46 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:56.054 11:45:46 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:56.054 11:45:46 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:56.054 11:45:46 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:56.314 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:56.314 /dev/nvme0n1: 8 bytes were erased at offset 0xe8e0db5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:56.314 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:56.314 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:56.314 11:45:46 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:04:56.314 11:45:46 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:56.314 11:45:46 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:56.314 11:45:46 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:56.314 11:45:46 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:56.314 11:45:46 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:56.314 11:45:46 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:56.314 00:04:56.314 real 0m26.618s 00:04:56.314 user 0m8.121s 00:04:56.314 sys 0m13.232s 00:04:56.314 11:45:46 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:56.314 11:45:46 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:56.314 ************************************ 00:04:56.314 END TEST devices 00:04:56.314 
************************************ 00:04:56.314 11:45:46 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:56.314 00:04:56.314 real 1m30.218s 00:04:56.314 user 0m30.851s 00:04:56.314 sys 0m50.079s 00:04:56.314 11:45:46 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:56.314 11:45:46 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:56.314 ************************************ 00:04:56.314 END TEST setup.sh 00:04:56.314 ************************************ 00:04:56.314 11:45:46 -- common/autotest_common.sh@1142 -- # return 0 00:04:56.314 11:45:46 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:59.605 Hugepages 00:04:59.605 node hugesize free / total 00:04:59.605 node0 1048576kB 0 / 0 00:04:59.605 node0 2048kB 1024 / 1024 00:04:59.605 node1 1048576kB 0 / 0 00:04:59.605 node1 2048kB 1024 / 1024 00:04:59.605 00:04:59.605 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:59.605 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:59.605 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:59.605 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:59.605 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:59.605 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:59.605 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:59.605 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:59.605 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:59.605 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:04:59.864 NVMe 0000:5f:00.0 1b96 2600 0 nvme nvme1 nvme1n1 nvme1n2 00:04:59.864 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:59.864 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:59.864 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:59.864 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:59.864 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:59.864 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:59.864 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:59.864 I/OAT 0000:80:04.7 8086 
2021 1 ioatdma - - 00:04:59.864 11:45:49 -- spdk/autotest.sh@130 -- # uname -s 00:04:59.864 11:45:49 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:04:59.864 11:45:49 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:04:59.864 11:45:49 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:03.156 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:05:03.156 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:03.156 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:04.095 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:05:04.095 11:45:54 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:05.033 11:45:55 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:05.033 11:45:55 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:05.033 11:45:55 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:05.033 11:45:55 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:05.033 11:45:55 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:05.033 11:45:55 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:05.033 11:45:55 -- common/autotest_common.sh@1514 
-- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:05.033 11:45:55 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:05.033 11:45:55 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:05.294 11:45:55 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:05.294 11:45:55 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:05.294 11:45:55 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:08.581 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:05:08.581 Waiting for block devices as requested 00:05:08.581 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:05:08.581 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:08.581 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:08.581 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:08.840 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:08.840 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:08.840 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:09.100 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:09.100 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:09.100 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:09.100 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:09.359 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:09.359 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:09.359 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:09.619 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:09.619 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:09.619 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:09.619 11:45:59 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:09.619 11:45:59 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:09.619 11:45:59 -- common/autotest_common.sh@1502 -- # readlink -f 
/sys/class/nvme/nvme0 /sys/class/nvme/nvme1 00:05:09.619 11:45:59 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:05:09.879 11:45:59 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:09.879 11:45:59 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:09.879 11:45:59 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:09.879 11:45:59 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:09.879 11:45:59 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:09.879 11:45:59 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:09.879 11:45:59 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:09.879 11:45:59 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:09.879 11:45:59 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:09.879 11:45:59 -- common/autotest_common.sh@1545 -- # oacs=' 0xf' 00:05:09.879 11:45:59 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:09.879 11:45:59 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:09.879 11:45:59 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:09.879 11:45:59 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:05:09.879 11:45:59 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:05:09.879 11:45:59 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:09.879 11:45:59 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:09.879 11:45:59 -- common/autotest_common.sh@1557 -- # continue 00:05:09.879 11:45:59 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:09.879 11:45:59 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:09.879 11:45:59 -- common/autotest_common.sh@10 -- # set +x 00:05:09.879 11:45:59 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:09.879 11:45:59 -- 
common/autotest_common.sh@722 -- # xtrace_disable 00:05:09.879 11:45:59 -- common/autotest_common.sh@10 -- # set +x 00:05:09.879 11:45:59 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:13.171 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:05:13.171 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:13.171 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:13.171 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:13.171 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:13.171 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:13.172 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:14.108 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:05:14.108 11:46:04 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:14.108 11:46:04 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:14.108 11:46:04 -- common/autotest_common.sh@10 -- # set +x 00:05:14.108 11:46:04 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:14.108 11:46:04 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:05:14.108 11:46:04 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:05:14.108 11:46:04 -- common/autotest_common.sh@1577 -- # bdfs=() 00:05:14.108 11:46:04 -- common/autotest_common.sh@1577 -- # local bdfs 00:05:14.108 11:46:04 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:05:14.108 
11:46:04 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:14.108 11:46:04 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:14.108 11:46:04 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:14.108 11:46:04 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:14.108 11:46:04 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:14.108 11:46:04 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:14.108 11:46:04 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:14.108 11:46:04 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:05:14.108 11:46:04 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:05:14.108 11:46:04 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:05:14.108 11:46:04 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:14.108 11:46:04 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:05:14.108 11:46:04 -- common/autotest_common.sh@1586 -- # printf '%s\n' 0000:5e:00.0 00:05:14.108 11:46:04 -- common/autotest_common.sh@1592 -- # [[ -z 0000:5e:00.0 ]] 00:05:14.108 11:46:04 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=542930 00:05:14.108 11:46:04 -- common/autotest_common.sh@1598 -- # waitforlisten 542930 00:05:14.108 11:46:04 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:14.108 11:46:04 -- common/autotest_common.sh@829 -- # '[' -z 542930 ']' 00:05:14.108 11:46:04 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:14.108 11:46:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:14.108 11:46:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:05:14.108 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:14.108 11:46:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:14.108 11:46:04 -- common/autotest_common.sh@10 -- # set +x 00:05:14.367 [2024-07-12 11:46:04.382598] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:05:14.367 [2024-07-12 11:46:04.382640] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid542930 ] 00:05:14.367 [2024-07-12 11:46:04.454023] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:14.367 [2024-07-12 11:46:04.530189] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:14.934 11:46:05 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:14.934 11:46:05 -- common/autotest_common.sh@862 -- # return 0 00:05:14.934 11:46:05 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:05:14.934 11:46:05 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:05:14.934 11:46:05 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:5e:00.0 00:05:18.222 nvme0n1 00:05:18.222 11:46:08 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:18.222 [2024-07-12 11:46:08.309722] nvme_opal.c:2063:spdk_opal_cmd_revert_tper: *ERROR*: Error on starting admin SP session with error 18 00:05:18.222 [2024-07-12 11:46:08.309750] vbdev_opal_rpc.c: 134:rpc_bdev_nvme_opal_revert: *ERROR*: Revert TPer failure: 18 00:05:18.222 request: 00:05:18.222 { 00:05:18.222 "nvme_ctrlr_name": "nvme0", 00:05:18.222 "password": "test", 00:05:18.222 "method": "bdev_nvme_opal_revert", 00:05:18.222 "req_id": 1 00:05:18.222 } 00:05:18.222 Got 
JSON-RPC error response 00:05:18.222 response: 00:05:18.222 { 00:05:18.222 "code": -32603, 00:05:18.222 "message": "Internal error" 00:05:18.222 } 00:05:18.222 11:46:08 -- common/autotest_common.sh@1604 -- # true 00:05:18.222 11:46:08 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:05:18.222 11:46:08 -- common/autotest_common.sh@1608 -- # killprocess 542930 00:05:18.222 11:46:08 -- common/autotest_common.sh@948 -- # '[' -z 542930 ']' 00:05:18.222 11:46:08 -- common/autotest_common.sh@952 -- # kill -0 542930 00:05:18.222 11:46:08 -- common/autotest_common.sh@953 -- # uname 00:05:18.222 11:46:08 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:18.222 11:46:08 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 542930 00:05:18.222 11:46:08 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:18.222 11:46:08 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:18.222 11:46:08 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 542930' 00:05:18.222 killing process with pid 542930 00:05:18.222 11:46:08 -- common/autotest_common.sh@967 -- # kill 542930 00:05:18.222 11:46:08 -- common/autotest_common.sh@972 -- # wait 542930 00:05:20.126 11:46:09 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:20.126 11:46:09 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:20.126 11:46:09 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:05:20.126 11:46:09 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:05:20.126 11:46:09 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:05:20.386 Restarting all devices. 
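Editor's note: two patterns from the log above are worth making explicit. First, `get_nvme_bdfs` builds its BDF array by piping `gen_nvme.sh` JSON through `jq -r '.config[].params.traddr'`. Second, the failed `bdev_nvme_opal_revert` call surfaces as a JSON-RPC 2.0 error object with `code: -32603`. The sketch below reproduces both extractions against sample JSON; the `config` payload shape is inferred from the jq filter in the log and is not a verbatim `gen_nvme.sh` capture.

```shell
# Assumed payload shape, reconstructed from the jq filter used by get_nvme_bdfs;
# only .config[].params.traddr is relied on.
config='{"config":[{"method":"bdev_nvme_attach_controller","params":{"trtype":"PCIe","name":"Nvme0","traddr":"0000:5e:00.0"}}]}'
bdfs=($(printf '%s' "$config" | jq -r '.config[].params.traddr'))
echo "${bdfs[0]}"

# JSON-RPC error body as printed by rpc.py above (reconstructed, not captured
# verbatim); pull out the numeric error code the same way.
resp='{"jsonrpc":"2.0","id":1,"error":{"code":-32603,"message":"Internal error"}}'
code=$(printf '%s' "$resp" | jq -r '.error.code')
echo "$code"
```

Both filters require jq, which the harness already depends on elsewhere in this run.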
00:05:24.580 lstat() error: No such file or directory 00:05:24.580 QAT Error: No GENERAL section found 00:05:24.580 Failed to configure qat_dev0 00:05:24.580 lstat() error: No such file or directory 00:05:24.580 QAT Error: No GENERAL section found 00:05:24.580 Failed to configure qat_dev1 00:05:24.580 lstat() error: No such file or directory 00:05:24.580 QAT Error: No GENERAL section found 00:05:24.580 Failed to configure qat_dev2 00:05:24.580 enable sriov 00:05:24.580 Checking status of all devices. 00:05:24.580 There is 3 QAT acceleration device(s) in the system: 00:05:24.580 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:1a:00.0, #accel: 5 #engines: 10 state: down 00:05:24.580 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:1c:00.0, #accel: 5 #engines: 10 state: down 00:05:24.580 qat_dev2 - type: c6xx, inst_id: 2, node_id: 0, bsf: 0000:1e:00.0, #accel: 5 #engines: 10 state: down 00:05:24.580 0000:1a:00.0 set to 16 VFs 00:05:25.518 0000:1c:00.0 set to 16 VFs 00:05:26.086 0000:1e:00.0 set to 16 VFs 00:05:27.463 Properly configured the qat device with driver uio_pci_generic. 
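Editor's note: the "set to 16 VFs" lines above come from `qat_setup.sh` enabling SR-IOV on each c6xx physical function. A minimal dry-run sketch of that step, assuming the three PF BDFs shown in the device listing; the real write goes to each device's `sriov_numvfs` sysfs attribute and needs root, so this version only echoes what it would do.

```shell
# PF addresses taken from the qat_dev0..2 listing above (c6xx devices).
qat_pfs=("0000:1a:00.0" "0000:1c:00.0" "0000:1e:00.0")
num_vfs=16
for bdf in "${qat_pfs[@]}"; do
  # Real run (as root): echo "$num_vfs" > "/sys/bus/pci/devices/$bdf/sriov_numvfs"
  echo "$bdf set to $num_vfs VFs"
done
```

Note that `sriov_numvfs` must be 0 before a new nonzero count can be written; setup scripts typically write 0 first when reconfiguring.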
00:05:27.463 11:46:17 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:27.463 11:46:17 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:27.463 11:46:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.463 11:46:17 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:05:27.463 11:46:17 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:27.463 11:46:17 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:27.463 11:46:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:27.463 11:46:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.463 ************************************ 00:05:27.463 START TEST env 00:05:27.463 ************************************ 00:05:27.463 11:46:17 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:27.723 * Looking for test storage... 00:05:27.723 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:27.723 11:46:17 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:27.723 11:46:17 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:27.723 11:46:17 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:27.723 11:46:17 env -- common/autotest_common.sh@10 -- # set +x 00:05:27.723 ************************************ 00:05:27.723 START TEST env_memory 00:05:27.723 ************************************ 00:05:27.723 11:46:17 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:27.723 00:05:27.723 00:05:27.723 CUnit - A unit testing framework for C - Version 2.1-3 00:05:27.723 http://cunit.sourceforge.net/ 00:05:27.723 00:05:27.723 00:05:27.723 Suite: memory 00:05:27.723 Test: alloc and free memory map ...[2024-07-12 11:46:17.821818] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:27.723 passed 00:05:27.723 Test: mem map translation ...[2024-07-12 11:46:17.840448] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:27.723 [2024-07-12 11:46:17.840464] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:27.723 [2024-07-12 11:46:17.840500] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:27.723 [2024-07-12 11:46:17.840509] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:27.723 passed 00:05:27.723 Test: mem map registration ...[2024-07-12 11:46:17.878979] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:27.723 [2024-07-12 11:46:17.878994] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:27.723 passed 00:05:27.723 Test: mem map adjacent registrations ...passed 00:05:27.723 00:05:27.723 Run Summary: Type Total Ran Passed Failed Inactive 00:05:27.723 suites 1 1 n/a 0 0 00:05:27.723 tests 4 4 4 0 0 00:05:27.723 asserts 152 152 152 0 n/a 00:05:27.723 00:05:27.723 Elapsed time = 0.143 seconds 00:05:27.723 00:05:27.723 real 0m0.156s 00:05:27.723 user 0m0.149s 00:05:27.723 sys 0m0.006s 00:05:27.723 11:46:17 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:05:27.723 11:46:17 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:27.723 ************************************ 00:05:27.723 END TEST env_memory 00:05:27.723 ************************************ 00:05:27.984 11:46:17 env -- common/autotest_common.sh@1142 -- # return 0 00:05:27.984 11:46:17 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:27.984 11:46:17 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:27.984 11:46:17 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:27.984 11:46:17 env -- common/autotest_common.sh@10 -- # set +x 00:05:27.984 ************************************ 00:05:27.984 START TEST env_vtophys 00:05:27.984 ************************************ 00:05:27.984 11:46:18 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:27.984 EAL: lib.eal log level changed from notice to debug 00:05:27.984 EAL: Detected lcore 0 as core 0 on socket 0 00:05:27.984 EAL: Detected lcore 1 as core 1 on socket 0 00:05:27.984 EAL: Detected lcore 2 as core 2 on socket 0 00:05:27.984 EAL: Detected lcore 3 as core 3 on socket 0 00:05:27.984 EAL: Detected lcore 4 as core 4 on socket 0 00:05:27.984 EAL: Detected lcore 5 as core 5 on socket 0 00:05:27.984 EAL: Detected lcore 6 as core 6 on socket 0 00:05:27.984 EAL: Detected lcore 7 as core 8 on socket 0 00:05:27.984 EAL: Detected lcore 8 as core 9 on socket 0 00:05:27.984 EAL: Detected lcore 9 as core 10 on socket 0 00:05:27.984 EAL: Detected lcore 10 as core 11 on socket 0 00:05:27.984 EAL: Detected lcore 11 as core 12 on socket 0 00:05:27.984 EAL: Detected lcore 12 as core 13 on socket 0 00:05:27.984 EAL: Detected lcore 13 as core 16 on socket 0 00:05:27.984 EAL: Detected lcore 14 as core 17 on socket 0 00:05:27.984 EAL: Detected lcore 15 as core 18 on socket 0 00:05:27.984 EAL: Detected lcore 16 as core 19 on socket 0 
00:05:27.984 EAL: Detected lcore 17 as core 20 on socket 0 00:05:27.984 EAL: Detected lcore 18 as core 21 on socket 0 00:05:27.984 EAL: Detected lcore 19 as core 25 on socket 0 00:05:27.984 EAL: Detected lcore 20 as core 26 on socket 0 00:05:27.984 EAL: Detected lcore 21 as core 27 on socket 0 00:05:27.984 EAL: Detected lcore 22 as core 28 on socket 0 00:05:27.984 EAL: Detected lcore 23 as core 29 on socket 0 00:05:27.984 EAL: Detected lcore 24 as core 0 on socket 1 00:05:27.984 EAL: Detected lcore 25 as core 1 on socket 1 00:05:27.984 EAL: Detected lcore 26 as core 2 on socket 1 00:05:27.984 EAL: Detected lcore 27 as core 3 on socket 1 00:05:27.984 EAL: Detected lcore 28 as core 4 on socket 1 00:05:27.984 EAL: Detected lcore 29 as core 5 on socket 1 00:05:27.984 EAL: Detected lcore 30 as core 6 on socket 1 00:05:27.984 EAL: Detected lcore 31 as core 8 on socket 1 00:05:27.984 EAL: Detected lcore 32 as core 9 on socket 1 00:05:27.984 EAL: Detected lcore 33 as core 10 on socket 1 00:05:27.984 EAL: Detected lcore 34 as core 11 on socket 1 00:05:27.984 EAL: Detected lcore 35 as core 12 on socket 1 00:05:27.984 EAL: Detected lcore 36 as core 13 on socket 1 00:05:27.984 EAL: Detected lcore 37 as core 16 on socket 1 00:05:27.984 EAL: Detected lcore 38 as core 17 on socket 1 00:05:27.984 EAL: Detected lcore 39 as core 18 on socket 1 00:05:27.984 EAL: Detected lcore 40 as core 19 on socket 1 00:05:27.984 EAL: Detected lcore 41 as core 20 on socket 1 00:05:27.984 EAL: Detected lcore 42 as core 21 on socket 1 00:05:27.984 EAL: Detected lcore 43 as core 25 on socket 1 00:05:27.984 EAL: Detected lcore 44 as core 26 on socket 1 00:05:27.984 EAL: Detected lcore 45 as core 27 on socket 1 00:05:27.984 EAL: Detected lcore 46 as core 28 on socket 1 00:05:27.984 EAL: Detected lcore 47 as core 29 on socket 1 00:05:27.984 EAL: Detected lcore 48 as core 0 on socket 0 00:05:27.984 EAL: Detected lcore 49 as core 1 on socket 0 00:05:27.984 EAL: Detected lcore 50 as core 2 on socket 0 
00:05:27.984 EAL: Detected lcore 51 as core 3 on socket 0 00:05:27.984 EAL: Detected lcore 52 as core 4 on socket 0 00:05:27.984 EAL: Detected lcore 53 as core 5 on socket 0 00:05:27.984 EAL: Detected lcore 54 as core 6 on socket 0 00:05:27.984 EAL: Detected lcore 55 as core 8 on socket 0 00:05:27.984 EAL: Detected lcore 56 as core 9 on socket 0 00:05:27.984 EAL: Detected lcore 57 as core 10 on socket 0 00:05:27.984 EAL: Detected lcore 58 as core 11 on socket 0 00:05:27.984 EAL: Detected lcore 59 as core 12 on socket 0 00:05:27.984 EAL: Detected lcore 60 as core 13 on socket 0 00:05:27.984 EAL: Detected lcore 61 as core 16 on socket 0 00:05:27.984 EAL: Detected lcore 62 as core 17 on socket 0 00:05:27.984 EAL: Detected lcore 63 as core 18 on socket 0 00:05:27.984 EAL: Detected lcore 64 as core 19 on socket 0 00:05:27.984 EAL: Detected lcore 65 as core 20 on socket 0 00:05:27.984 EAL: Detected lcore 66 as core 21 on socket 0 00:05:27.984 EAL: Detected lcore 67 as core 25 on socket 0 00:05:27.984 EAL: Detected lcore 68 as core 26 on socket 0 00:05:27.984 EAL: Detected lcore 69 as core 27 on socket 0 00:05:27.984 EAL: Detected lcore 70 as core 28 on socket 0 00:05:27.984 EAL: Detected lcore 71 as core 29 on socket 0 00:05:27.984 EAL: Detected lcore 72 as core 0 on socket 1 00:05:27.984 EAL: Detected lcore 73 as core 1 on socket 1 00:05:27.984 EAL: Detected lcore 74 as core 2 on socket 1 00:05:27.984 EAL: Detected lcore 75 as core 3 on socket 1 00:05:27.984 EAL: Detected lcore 76 as core 4 on socket 1 00:05:27.984 EAL: Detected lcore 77 as core 5 on socket 1 00:05:27.984 EAL: Detected lcore 78 as core 6 on socket 1 00:05:27.984 EAL: Detected lcore 79 as core 8 on socket 1 00:05:27.984 EAL: Detected lcore 80 as core 9 on socket 1 00:05:27.984 EAL: Detected lcore 81 as core 10 on socket 1 00:05:27.984 EAL: Detected lcore 82 as core 11 on socket 1 00:05:27.984 EAL: Detected lcore 83 as core 12 on socket 1 00:05:27.984 EAL: Detected lcore 84 as core 13 on socket 1 
00:05:27.984 EAL: Detected lcore 85 as core 16 on socket 1 00:05:27.984 EAL: Detected lcore 86 as core 17 on socket 1 00:05:27.984 EAL: Detected lcore 87 as core 18 on socket 1 00:05:27.984 EAL: Detected lcore 88 as core 19 on socket 1 00:05:27.984 EAL: Detected lcore 89 as core 20 on socket 1 00:05:27.984 EAL: Detected lcore 90 as core 21 on socket 1 00:05:27.984 EAL: Detected lcore 91 as core 25 on socket 1 00:05:27.984 EAL: Detected lcore 92 as core 26 on socket 1 00:05:27.985 EAL: Detected lcore 93 as core 27 on socket 1 00:05:27.985 EAL: Detected lcore 94 as core 28 on socket 1 00:05:27.985 EAL: Detected lcore 95 as core 29 on socket 1 00:05:27.985 EAL: Maximum logical cores by configuration: 128 00:05:27.985 EAL: Detected CPU lcores: 96 00:05:27.985 EAL: Detected NUMA nodes: 2 00:05:27.985 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:27.985 EAL: Detected shared linkage of DPDK 00:05:27.985 EAL: No shared files mode enabled, IPC will be disabled 00:05:27.985 EAL: No shared files mode enabled, IPC is disabled 00:05:27.985 EAL: PCI driver qat for device 0000:1a:01.0 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:01.1 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:01.2 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:01.3 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:01.4 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:01.5 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:01.6 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:01.7 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:02.0 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:02.1 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:02.2 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:02.3 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 
0000:1a:02.4 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:02.5 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:02.6 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1a:02.7 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:01.0 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:01.1 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:01.2 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:01.3 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:01.4 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:01.5 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:01.6 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:01.7 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:02.0 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:02.1 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:02.2 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:02.3 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:02.4 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:02.5 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:02.6 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1c:02.7 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:01.0 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:01.1 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:01.2 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:01.3 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:01.4 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:01.5 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:01.6 wants IOVA 
as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:01.7 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:02.0 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:02.1 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:02.2 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:02.3 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:02.4 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:02.5 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:02.6 wants IOVA as 'PA' 00:05:27.985 EAL: PCI driver qat for device 0000:1e:02.7 wants IOVA as 'PA' 00:05:27.985 EAL: Bus pci wants IOVA as 'PA' 00:05:27.985 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:27.985 EAL: Bus vdev wants IOVA as 'DC' 00:05:27.985 EAL: Selected IOVA mode 'PA' 00:05:27.985 EAL: Probing VFIO support... 00:05:27.985 EAL: IOMMU type 1 (Type 1) is supported 00:05:27.985 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:27.985 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:27.985 EAL: VFIO support initialized 00:05:27.985 EAL: Ask a virtual area of 0x2e000 bytes 00:05:27.985 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:27.985 EAL: Setting up physically contiguous memory... 
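Editor's note: EAL ends up with "Selected IOVA mode 'PA'" because every qat PF/VF driver requests physical addressing, while the pci/auxiliary/vdev buses would accept either ('DC' = doesn't care). A toy model of that arbitration, greatly simplified from DPDK's real bus/IOMMU-class logic: any hard 'PA' requirement forces the global mode to 'PA'.

```shell
# Simplified assumption: one 'PA' vote wins; otherwise default to 'VA'.
# DPDK's actual decision also weighs IOMMU availability and kernel driver binding.
pick_iova_mode() {
  local want
  for want in "$@"; do
    if [ "$want" = "PA" ]; then
      echo PA
      return
    fi
  done
  echo VA
}

pick_iova_mode DC DC PA   # qat wants PA, buses vote DC -> PA
```

With no device demanding PA (e.g. everything on vfio-pci with a working IOMMU), the same inputs would let EAL run in 'VA' mode.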
00:05:27.985 EAL: Setting maximum number of open files to 524288 00:05:27.985 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:27.985 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:27.985 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:27.985 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.985 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:27.985 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:27.985 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.985 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:27.985 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:27.985 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.985 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:27.985 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:27.985 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.985 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:27.985 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:27.985 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.985 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:27.985 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:27.985 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.985 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:27.985 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:27.985 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.985 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:27.985 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:27.985 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.985 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:27.985 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:27.985 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:05:27.985 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.985 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:27.985 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:27.985 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.985 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:27.985 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:27.985 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.985 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:27.985 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:27.985 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.985 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:27.985 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:27.985 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.985 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:27.985 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:27.985 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.985 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:27.985 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:27.985 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.985 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:27.985 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:27.985 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.985 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:27.985 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:27.985 EAL: Hugepages will be freed exactly as allocated. 
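Editor's note: the memseg list sizing above is internally consistent and easy to check: each list holds n_segs:8192 pages of hugepage_sz:2097152 (2 MiB), which is exactly the 0x400000000 (16 GiB) virtual-area reservation requested per list, eight lists across the two NUMA sockets.

```shell
# 8192 pages * 2 MiB/page = 2^13 * 2^21 bytes = 2^34 bytes = 16 GiB.
list_bytes=$(( 8192 * 2097152 ))
printf '0x%x\n' "$list_bytes"   # -> 0x400000000
```

The smaller 0x61000 request alongside each list is the per-list metadata (fbarray) allocation, not hugepage memory.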
00:05:27.985 EAL: No shared files mode enabled, IPC is disabled 00:05:27.985 EAL: No shared files mode enabled, IPC is disabled 00:05:27.985 EAL: TSC frequency is ~2100000 KHz 00:05:27.985 EAL: Main lcore 0 is ready (tid=7f06d3c14b00;cpuset=[0]) 00:05:27.985 EAL: Trying to obtain current memory policy. 00:05:27.985 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:27.985 EAL: Restoring previous memory policy: 0 00:05:27.985 EAL: request: mp_malloc_sync 00:05:27.985 EAL: No shared files mode enabled, IPC is disabled 00:05:27.985 EAL: Heap on socket 0 was expanded by 2MB 00:05:27.985 EAL: PCI device 0000:1a:01.0 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.985 EAL: PCI memory mapped at 0x202001000000 00:05:27.985 EAL: PCI memory mapped at 0x202001001000 00:05:27.985 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:27.985 EAL: PCI device 0000:1a:01.1 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.985 EAL: PCI memory mapped at 0x202001002000 00:05:27.985 EAL: PCI memory mapped at 0x202001003000 00:05:27.985 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:27.985 EAL: PCI device 0000:1a:01.2 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.985 EAL: PCI memory mapped at 0x202001004000 00:05:27.985 EAL: PCI memory mapped at 0x202001005000 00:05:27.985 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:27.985 EAL: PCI device 0000:1a:01.3 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.985 EAL: PCI memory mapped at 0x202001006000 00:05:27.985 EAL: PCI memory mapped at 0x202001007000 00:05:27.985 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:27.985 EAL: PCI device 0000:1a:01.4 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.985 EAL: PCI memory mapped at 0x202001008000 00:05:27.985 EAL: PCI memory mapped at 0x202001009000 00:05:27.985 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:27.985 EAL: PCI device 0000:1a:01.5 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.985 EAL: PCI memory mapped at 0x20200100a000 00:05:27.985 EAL: PCI memory mapped at 0x20200100b000 00:05:27.985 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:27.985 EAL: PCI device 0000:1a:01.6 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.985 EAL: PCI memory mapped at 0x20200100c000 00:05:27.985 EAL: PCI memory mapped at 0x20200100d000 00:05:27.985 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:27.985 EAL: PCI device 0000:1a:01.7 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.985 EAL: PCI memory mapped at 0x20200100e000 00:05:27.985 EAL: PCI memory mapped at 0x20200100f000 00:05:27.985 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:27.985 EAL: PCI device 0000:1a:02.0 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.985 EAL: PCI memory mapped at 0x202001010000 00:05:27.985 EAL: PCI memory mapped at 0x202001011000 00:05:27.985 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:27.985 EAL: PCI device 0000:1a:02.1 on NUMA socket 0 00:05:27.985 EAL: probe driver: 8086:37c9 qat 00:05:27.986 EAL: PCI memory mapped at 0x202001012000 00:05:27.986 EAL: PCI memory mapped at 0x202001013000 00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:27.986 EAL: PCI device 0000:1a:02.2 on NUMA socket 0 00:05:27.986 EAL: probe driver: 8086:37c9 qat 00:05:27.986 EAL: PCI memory mapped at 0x202001014000 00:05:27.986 EAL: PCI memory mapped at 0x202001015000 00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:27.986 EAL: PCI device 0000:1a:02.3 on NUMA socket 0 00:05:27.986 EAL: probe driver: 8086:37c9 qat 00:05:27.986 EAL: PCI memory mapped at 
0x202001016000 00:05:27.986 EAL: PCI memory mapped at 0x202001017000 00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:27.986 EAL: PCI device 0000:1a:02.4 on NUMA socket 0 00:05:27.986 EAL: probe driver: 8086:37c9 qat 00:05:27.986 EAL: PCI memory mapped at 0x202001018000 00:05:27.986 EAL: PCI memory mapped at 0x202001019000 00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:05:27.986 EAL: PCI device 0000:1a:02.5 on NUMA socket 0 00:05:27.986 EAL: probe driver: 8086:37c9 qat 00:05:27.986 EAL: PCI memory mapped at 0x20200101a000 00:05:27.986 EAL: PCI memory mapped at 0x20200101b000 00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:27.986 EAL: PCI device 0000:1a:02.6 on NUMA socket 0 00:05:27.986 EAL: probe driver: 8086:37c9 qat 00:05:27.986 EAL: PCI memory mapped at 0x20200101c000 00:05:27.986 EAL: PCI memory mapped at 0x20200101d000 00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:27.986 EAL: PCI device 0000:1a:02.7 on NUMA socket 0 00:05:27.986 EAL: probe driver: 8086:37c9 qat 00:05:27.986 EAL: PCI memory mapped at 0x20200101e000 00:05:27.986 EAL: PCI memory mapped at 0x20200101f000 00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:27.986 EAL: PCI device 0000:1c:01.0 on NUMA socket 0 00:05:27.986 EAL: probe driver: 8086:37c9 qat 00:05:27.986 EAL: PCI memory mapped at 0x202001020000 00:05:27.986 EAL: PCI memory mapped at 0x202001021000 00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:27.986 EAL: PCI device 0000:1c:01.1 on NUMA socket 0 00:05:27.986 EAL: probe driver: 8086:37c9 qat 00:05:27.986 EAL: PCI memory mapped at 0x202001022000 00:05:27.986 EAL: PCI memory mapped at 0x202001023000 00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:27.986 EAL: PCI device 0000:1c:01.2 on NUMA socket 0 
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001024000
00:05:27.986 EAL: PCI memory mapped at 0x202001025000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:01.3 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001026000
00:05:27.986 EAL: PCI memory mapped at 0x202001027000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:01.4 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001028000
00:05:27.986 EAL: PCI memory mapped at 0x202001029000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:01.5 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200102a000
00:05:27.986 EAL: PCI memory mapped at 0x20200102b000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:01.6 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200102c000
00:05:27.986 EAL: PCI memory mapped at 0x20200102d000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:01.7 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200102e000
00:05:27.986 EAL: PCI memory mapped at 0x20200102f000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:02.0 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001030000
00:05:27.986 EAL: PCI memory mapped at 0x202001031000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:02.1 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001032000
00:05:27.986 EAL: PCI memory mapped at 0x202001033000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:02.2 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001034000
00:05:27.986 EAL: PCI memory mapped at 0x202001035000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:02.3 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001036000
00:05:27.986 EAL: PCI memory mapped at 0x202001037000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:02.4 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001038000
00:05:27.986 EAL: PCI memory mapped at 0x202001039000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:02.5 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200103a000
00:05:27.986 EAL: PCI memory mapped at 0x20200103b000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:02.6 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200103c000
00:05:27.986 EAL: PCI memory mapped at 0x20200103d000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0)
00:05:27.986 EAL: PCI device 0000:1c:02.7 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200103e000
00:05:27.986 EAL: PCI memory mapped at 0x20200103f000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:01.0 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001040000
00:05:27.986 EAL: PCI memory mapped at 0x202001041000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:01.1 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001042000
00:05:27.986 EAL: PCI memory mapped at 0x202001043000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:01.2 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001044000
00:05:27.986 EAL: PCI memory mapped at 0x202001045000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:01.3 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001046000
00:05:27.986 EAL: PCI memory mapped at 0x202001047000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:01.4 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001048000
00:05:27.986 EAL: PCI memory mapped at 0x202001049000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:01.5 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200104a000
00:05:27.986 EAL: PCI memory mapped at 0x20200104b000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:01.6 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200104c000
00:05:27.986 EAL: PCI memory mapped at 0x20200104d000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:01.7 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200104e000
00:05:27.986 EAL: PCI memory mapped at 0x20200104f000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:02.0 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001050000
00:05:27.986 EAL: PCI memory mapped at 0x202001051000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:02.1 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001052000
00:05:27.986 EAL: PCI memory mapped at 0x202001053000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:02.2 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001054000
00:05:27.986 EAL: PCI memory mapped at 0x202001055000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:02.3 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001056000
00:05:27.986 EAL: PCI memory mapped at 0x202001057000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:02.4 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x202001058000
00:05:27.986 EAL: PCI memory mapped at 0x202001059000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:02.5 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200105a000
00:05:27.986 EAL: PCI memory mapped at 0x20200105b000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:02.6 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200105c000
00:05:27.986 EAL: PCI memory mapped at 0x20200105d000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0)
00:05:27.986 EAL: PCI device 0000:1e:02.7 on NUMA socket 0
00:05:27.986 EAL: probe driver: 8086:37c9 qat
00:05:27.986 EAL: PCI memory mapped at 0x20200105e000
00:05:27.986 EAL: PCI memory mapped at 0x20200105f000
00:05:27.986 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0)
00:05:27.986 EAL: No shared files mode enabled, IPC is disabled
00:05:27.986 EAL: No shared files mode enabled, IPC is disabled
00:05:27.986 EAL: No PCI address specified using 'addr=<id>' in: bus=pci
00:05:27.986 EAL: Mem event callback 'spdk:(nil)' registered
00:05:27.986
00:05:27.987
00:05:27.987 CUnit - A unit testing framework for C - Version 2.1-3
00:05:27.987 http://cunit.sourceforge.net/
00:05:27.987
00:05:27.987
00:05:27.987 Suite: components_suite
00:05:27.987 Test: vtophys_malloc_test ...passed
00:05:27.987 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:05:27.987 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:27.987 EAL: Restoring previous memory policy: 4
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was expanded by 4MB
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was shrunk by 4MB
00:05:27.987 EAL: Trying to obtain current memory policy.
00:05:27.987 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:27.987 EAL: Restoring previous memory policy: 4
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was expanded by 6MB
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was shrunk by 6MB
00:05:27.987 EAL: Trying to obtain current memory policy.
00:05:27.987 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:27.987 EAL: Restoring previous memory policy: 4
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was expanded by 10MB
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was shrunk by 10MB
00:05:27.987 EAL: Trying to obtain current memory policy.
00:05:27.987 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:27.987 EAL: Restoring previous memory policy: 4
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was expanded by 18MB
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was shrunk by 18MB
00:05:27.987 EAL: Trying to obtain current memory policy.
00:05:27.987 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:27.987 EAL: Restoring previous memory policy: 4
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was expanded by 34MB
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was shrunk by 34MB
00:05:27.987 EAL: Trying to obtain current memory policy.
00:05:27.987 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:27.987 EAL: Restoring previous memory policy: 4
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was expanded by 66MB
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was shrunk by 66MB
00:05:27.987 EAL: Trying to obtain current memory policy.
00:05:27.987 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:27.987 EAL: Restoring previous memory policy: 4
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was expanded by 130MB
00:05:27.987 EAL: Calling mem event callback 'spdk:(nil)'
00:05:27.987 EAL: request: mp_malloc_sync
00:05:27.987 EAL: No shared files mode enabled, IPC is disabled
00:05:27.987 EAL: Heap on socket 0 was shrunk by 130MB
00:05:27.987 EAL: Trying to obtain current memory policy.
00:05:27.987 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:28.246 EAL: Restoring previous memory policy: 4
00:05:28.246 EAL: Calling mem event callback 'spdk:(nil)'
00:05:28.246 EAL: request: mp_malloc_sync
00:05:28.246 EAL: No shared files mode enabled, IPC is disabled
00:05:28.246 EAL: Heap on socket 0 was expanded by 258MB
00:05:28.246 EAL: Calling mem event callback 'spdk:(nil)'
00:05:28.246 EAL: request: mp_malloc_sync
00:05:28.246 EAL: No shared files mode enabled, IPC is disabled
00:05:28.246 EAL: Heap on socket 0 was shrunk by 258MB
00:05:28.246 EAL: Trying to obtain current memory policy.
00:05:28.246 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:28.246 EAL: Restoring previous memory policy: 4
00:05:28.246 EAL: Calling mem event callback 'spdk:(nil)'
00:05:28.246 EAL: request: mp_malloc_sync
00:05:28.246 EAL: No shared files mode enabled, IPC is disabled
00:05:28.246 EAL: Heap on socket 0 was expanded by 514MB
00:05:28.505 EAL: Calling mem event callback 'spdk:(nil)'
00:05:28.505 EAL: request: mp_malloc_sync
00:05:28.505 EAL: No shared files mode enabled, IPC is disabled
00:05:28.505 EAL: Heap on socket 0 was shrunk by 514MB
00:05:28.505 EAL: Trying to obtain current memory policy.
00:05:28.505 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:28.763 EAL: Restoring previous memory policy: 4
00:05:28.763 EAL: Calling mem event callback 'spdk:(nil)'
00:05:28.763 EAL: request: mp_malloc_sync
00:05:28.763 EAL: No shared files mode enabled, IPC is disabled
00:05:28.763 EAL: Heap on socket 0 was expanded by 1026MB
00:05:28.763 EAL: Calling mem event callback 'spdk:(nil)'
00:05:29.022 EAL: request: mp_malloc_sync
00:05:29.022 EAL: No shared files mode enabled, IPC is disabled
00:05:29.022 EAL: Heap on socket 0 was shrunk by 1026MB
00:05:29.022 passed
00:05:29.022
00:05:29.022 Run Summary: Type Total Ran Passed Failed Inactive
00:05:29.022 suites 1 1 n/a 0 0
00:05:29.022 tests 2 2 2 0 0
00:05:29.022 asserts 6569 6569 6569 0 n/a
00:05:29.022
00:05:29.022 Elapsed time = 0.965 seconds
00:05:29.022 EAL: No shared files mode enabled, IPC is disabled
00:05:29.022 EAL: No shared files mode enabled, IPC is disabled
00:05:29.022 EAL: No shared files mode enabled, IPC is disabled
00:05:29.022
00:05:29.022 real 0m1.097s
00:05:29.022 user 0m0.646s
00:05:29.022 sys 0m0.430s
00:05:29.022 11:46:19 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:29.022 11:46:19 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:05:29.022 ************************************
00:05:29.022 END TEST env_vtophys
00:05:29.022 ************************************
00:05:29.022 11:46:19 env -- common/autotest_common.sh@1142 -- # return 0
00:05:29.022 11:46:19 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:29.022 11:46:19 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:29.022 11:46:19 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:29.022 11:46:19 env -- common/autotest_common.sh@10 -- # set +x
00:05:29.023 ************************************
00:05:29.023 START TEST env_pci
00:05:29.023 ************************************
00:05:29.023 11:46:19 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:29.023
00:05:29.023
00:05:29.023 CUnit - A unit testing framework for C - Version 2.1-3
00:05:29.023 http://cunit.sourceforge.net/
00:05:29.023
00:05:29.023
00:05:29.023 Suite: pci
00:05:29.023 Test: pci_hook ...[2024-07-12 11:46:19.188999] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 545499 has claimed it
00:05:29.023 EAL: Cannot find device (10000:00:01.0)
00:05:29.023 EAL: Failed to attach device on primary process
00:05:29.023 passed
00:05:29.023
00:05:29.023 Run Summary: Type Total Ran Passed Failed Inactive
00:05:29.023 suites 1 1 n/a 0 0
00:05:29.023 tests 1 1 1 0 0
00:05:29.023 asserts 25 25 25 0 n/a
00:05:29.023
00:05:29.023 Elapsed time = 0.028 seconds
00:05:29.023
00:05:29.023 real 0m0.053s
00:05:29.023 user 0m0.018s
00:05:29.023 sys 0m0.035s
00:05:29.023 11:46:19 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:29.023 11:46:19 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:05:29.023 ************************************
00:05:29.023 END TEST env_pci
00:05:29.023 ************************************
00:05:29.023 11:46:19 env -- common/autotest_common.sh@1142 -- # return 0
00:05:29.023 11:46:19 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:05:29.023 11:46:19 env -- env/env.sh@15 -- # uname
00:05:29.023 11:46:19 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:05:29.023 11:46:19 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:05:29.023 11:46:19 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:29.023 11:46:19 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:05:29.023 11:46:19 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:29.023 11:46:19 env -- common/autotest_common.sh@10 -- # set +x
00:05:29.283 ************************************
00:05:29.283 START TEST env_dpdk_post_init
00:05:29.283 ************************************
00:05:29.283 11:46:19 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:29.283 EAL: Detected CPU lcores: 96
00:05:29.283 EAL: Detected NUMA nodes: 2
00:05:29.283 EAL: Detected shared linkage of DPDK
00:05:29.283 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:05:29.283 EAL: Selected IOVA mode 'PA'
00:05:29.283 EAL: VFIO support initialized
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.283 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0)
00:05:29.283 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym
00:05:29.283 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0)
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym
00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0)
00:05:29.284
CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:05:29.284 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:29.284 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:29.284 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:29.285 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:29.285 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:05:29.285 
CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:29.285 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:29.285 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:29.285 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:29.285 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:29.285 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:05:29.285 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:29.285 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:29.285 EAL: Using IOMMU type 1 (Type 1) 00:05:29.285 EAL: Ignore 
mapping IO port bar(1) 00:05:29.285 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:05:29.285 EAL: Ignore mapping IO port bar(1) 00:05:29.285 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:05:29.285 EAL: Ignore mapping IO port bar(1) 00:05:29.285 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:05:29.285 EAL: Ignore mapping IO port bar(1) 00:05:29.285 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:05:29.285 EAL: Ignore mapping IO port bar(1) 00:05:29.285 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:05:29.285 EAL: Ignore mapping IO port bar(1) 00:05:29.285 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:05:29.285 EAL: Ignore mapping IO port bar(1) 00:05:29.285 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:05:29.285 EAL: Ignore mapping IO port bar(1) 00:05:29.285 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:05:30.223 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:5e:00.0 (socket 0) 00:05:30.223 EAL: Ignore mapping IO port bar(1) 00:05:30.223 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:05:30.223 EAL: Ignore mapping IO port bar(1) 00:05:30.223 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:05:30.223 EAL: Ignore mapping IO port bar(1) 00:05:30.223 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:05:30.223 EAL: Ignore mapping IO port bar(1) 00:05:30.223 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:05:30.223 EAL: Ignore mapping IO port bar(1) 00:05:30.223 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:05:30.223 EAL: Ignore mapping IO port bar(1) 00:05:30.223 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 
00:05:30.223 EAL: Ignore mapping IO port bar(1) 00:05:30.223 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:05:30.223 EAL: Ignore mapping IO port bar(1) 00:05:30.223 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:05:33.511 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:05:33.511 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:05:33.511 Starting DPDK initialization... 00:05:33.511 Starting SPDK post initialization... 00:05:33.511 SPDK NVMe probe 00:05:33.511 Attaching to 0000:5e:00.0 00:05:33.511 Attached to 0000:5e:00.0 00:05:33.511 Cleaning up... 00:05:33.511 00:05:33.511 real 0m4.421s 00:05:33.511 user 0m3.350s 00:05:33.511 sys 0m0.142s 00:05:33.511 11:46:23 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:33.511 11:46:23 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:33.511 ************************************ 00:05:33.512 END TEST env_dpdk_post_init 00:05:33.512 ************************************ 00:05:33.512 11:46:23 env -- common/autotest_common.sh@1142 -- # return 0 00:05:33.512 11:46:23 env -- env/env.sh@26 -- # uname 00:05:33.512 11:46:23 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:33.512 11:46:23 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:33.512 11:46:23 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:33.512 11:46:23 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:33.512 11:46:23 env -- common/autotest_common.sh@10 -- # set +x 00:05:33.772 ************************************ 00:05:33.772 START TEST env_mem_callbacks 00:05:33.772 ************************************ 00:05:33.772 11:46:23 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:33.772 EAL: Detected 
CPU lcores: 96 00:05:33.772 EAL: Detected NUMA nodes: 2 00:05:33.772 EAL: Detected shared linkage of DPDK 00:05:33.772 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:33.772 EAL: Selected IOVA mode 'PA' 00:05:33.772 EAL: VFIO support initialized 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max 
queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 
0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 
0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:1c:01.6 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 
0000:1c:02.2_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:05:33.772 CRYPTODEV: 
Creating cryptodev 0000:1c:02.7_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:05:33.772 CRYPTODEV: 
Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 
00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.772 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:33.772 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:05:33.772 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.773 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:05:33.773 CRYPTODEV: Initialisation parameters - name: 
0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.773 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:05:33.773 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 00:05:33.773 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.773 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:05:33.773 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.773 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:05:33.773 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:05:33.773 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.773 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:05:33.773 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.773 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:05:33.773 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:05:33.773 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:33.773 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:05:33.773 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:33.773 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:33.773 00:05:33.773 00:05:33.773 CUnit - A unit testing framework for C - Version 2.1-3 00:05:33.773 http://cunit.sourceforge.net/ 00:05:33.773 00:05:33.773 00:05:33.773 Suite: memory 00:05:33.773 Test: test ... 
00:05:33.773 register 0x200000200000 2097152 00:05:33.773 malloc 3145728 00:05:33.773 register 0x200000400000 4194304 00:05:33.773 buf 0x200000500000 len 3145728 PASSED 00:05:33.773 malloc 64 00:05:33.773 buf 0x2000004fff40 len 64 PASSED 00:05:33.773 malloc 4194304 00:05:33.773 register 0x200000800000 6291456 00:05:33.773 buf 0x200000a00000 len 4194304 PASSED 00:05:33.773 free 0x200000500000 3145728 00:05:33.773 free 0x2000004fff40 64 00:05:33.773 unregister 0x200000400000 4194304 PASSED 00:05:33.773 free 0x200000a00000 4194304 00:05:33.773 unregister 0x200000800000 6291456 PASSED 00:05:33.773 malloc 8388608 00:05:33.773 register 0x200000400000 10485760 00:05:33.773 buf 0x200000600000 len 8388608 PASSED 00:05:33.773 free 0x200000600000 8388608 00:05:33.773 unregister 0x200000400000 10485760 PASSED 00:05:33.773 passed 00:05:33.773 00:05:33.773 Run Summary: Type Total Ran Passed Failed Inactive 00:05:33.773 suites 1 1 n/a 0 0 00:05:33.773 tests 1 1 1 0 0 00:05:33.773 asserts 15 15 15 0 n/a 00:05:33.773 00:05:33.773 Elapsed time = 0.006 seconds 00:05:33.773 00:05:33.773 real 0m0.075s 00:05:33.773 user 0m0.026s 00:05:33.773 sys 0m0.048s 00:05:33.773 11:46:23 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:33.773 11:46:23 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:33.773 ************************************ 00:05:33.773 END TEST env_mem_callbacks 00:05:33.773 ************************************ 00:05:33.773 11:46:23 env -- common/autotest_common.sh@1142 -- # return 0 00:05:33.773 00:05:33.773 real 0m6.236s 00:05:33.773 user 0m4.376s 00:05:33.773 sys 0m0.938s 00:05:33.773 11:46:23 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:33.773 11:46:23 env -- common/autotest_common.sh@10 -- # set +x 00:05:33.773 ************************************ 00:05:33.773 END TEST env 00:05:33.773 ************************************ 00:05:33.773 11:46:23 -- common/autotest_common.sh@1142 -- # return 0 
00:05:33.773 11:46:23 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:33.773 11:46:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:33.773 11:46:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:33.773 11:46:23 -- common/autotest_common.sh@10 -- # set +x 00:05:33.773 ************************************ 00:05:33.773 START TEST rpc 00:05:33.773 ************************************ 00:05:33.773 11:46:23 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:34.032 * Looking for test storage... 00:05:34.032 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:34.032 11:46:24 rpc -- rpc/rpc.sh@65 -- # spdk_pid=546516 00:05:34.032 11:46:24 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:34.032 11:46:24 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:34.032 11:46:24 rpc -- rpc/rpc.sh@67 -- # waitforlisten 546516 00:05:34.032 11:46:24 rpc -- common/autotest_common.sh@829 -- # '[' -z 546516 ']' 00:05:34.032 11:46:24 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:34.032 11:46:24 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:34.032 11:46:24 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:34.032 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:34.032 11:46:24 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:34.032 11:46:24 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:34.032 [2024-07-12 11:46:24.110249] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:05:34.032 [2024-07-12 11:46:24.110292] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid546516 ] 00:05:34.032 [2024-07-12 11:46:24.185499] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:34.032 [2024-07-12 11:46:24.261483] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:34.032 [2024-07-12 11:46:24.261525] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 546516' to capture a snapshot of events at runtime. 00:05:34.032 [2024-07-12 11:46:24.261531] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:34.032 [2024-07-12 11:46:24.261537] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:34.032 [2024-07-12 11:46:24.261542] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid546516 for offline analysis/debug. 
00:05:34.032 [2024-07-12 11:46:24.261562] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.970 11:46:24 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:34.970 11:46:24 rpc -- common/autotest_common.sh@862 -- # return 0 00:05:34.970 11:46:24 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:34.970 11:46:24 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:34.970 11:46:24 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:34.970 11:46:24 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:34.970 11:46:24 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:34.970 11:46:24 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:34.970 11:46:24 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:34.970 ************************************ 00:05:34.970 START TEST rpc_integrity 00:05:34.970 ************************************ 00:05:34.970 11:46:24 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:34.970 11:46:24 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:34.970 11:46:24 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.970 11:46:24 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.970 11:46:24 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.970 11:46:24 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # 
bdevs='[]' 00:05:34.970 11:46:24 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:34.970 11:46:24 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:34.970 11:46:24 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:34.970 11:46:24 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.970 11:46:24 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.970 11:46:24 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.970 11:46:24 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:34.970 11:46:24 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:34.970 11:46:24 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.970 11:46:24 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.970 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.971 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:34.971 { 00:05:34.971 "name": "Malloc0", 00:05:34.971 "aliases": [ 00:05:34.971 "2e1acd0f-fe1c-489f-af4f-0f78867069b5" 00:05:34.971 ], 00:05:34.971 "product_name": "Malloc disk", 00:05:34.971 "block_size": 512, 00:05:34.971 "num_blocks": 16384, 00:05:34.971 "uuid": "2e1acd0f-fe1c-489f-af4f-0f78867069b5", 00:05:34.971 "assigned_rate_limits": { 00:05:34.971 "rw_ios_per_sec": 0, 00:05:34.971 "rw_mbytes_per_sec": 0, 00:05:34.971 "r_mbytes_per_sec": 0, 00:05:34.971 "w_mbytes_per_sec": 0 00:05:34.971 }, 00:05:34.971 "claimed": false, 00:05:34.971 "zoned": false, 00:05:34.971 "supported_io_types": { 00:05:34.971 "read": true, 00:05:34.971 "write": true, 00:05:34.971 "unmap": true, 00:05:34.971 "flush": true, 00:05:34.971 "reset": true, 00:05:34.971 "nvme_admin": false, 00:05:34.971 "nvme_io": false, 00:05:34.971 "nvme_io_md": false, 00:05:34.971 "write_zeroes": true, 00:05:34.971 "zcopy": true, 00:05:34.971 "get_zone_info": false, 00:05:34.971 "zone_management": 
false, 00:05:34.971 "zone_append": false, 00:05:34.971 "compare": false, 00:05:34.971 "compare_and_write": false, 00:05:34.971 "abort": true, 00:05:34.971 "seek_hole": false, 00:05:34.971 "seek_data": false, 00:05:34.971 "copy": true, 00:05:34.971 "nvme_iov_md": false 00:05:34.971 }, 00:05:34.971 "memory_domains": [ 00:05:34.971 { 00:05:34.971 "dma_device_id": "system", 00:05:34.971 "dma_device_type": 1 00:05:34.971 }, 00:05:34.971 { 00:05:34.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.971 "dma_device_type": 2 00:05:34.971 } 00:05:34.971 ], 00:05:34.971 "driver_specific": {} 00:05:34.971 } 00:05:34.971 ]' 00:05:34.971 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:34.971 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:34.971 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:34.971 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.971 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.971 [2024-07-12 11:46:25.055135] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:34.971 [2024-07-12 11:46:25.055161] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:34.971 [2024-07-12 11:46:25.055172] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7ebe90 00:05:34.971 [2024-07-12 11:46:25.055177] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:34.971 [2024-07-12 11:46:25.056257] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:34.971 [2024-07-12 11:46:25.056276] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:34.971 Passthru0 00:05:34.971 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.971 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 
00:05:34.971 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.971 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.971 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.971 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:34.971 { 00:05:34.971 "name": "Malloc0", 00:05:34.971 "aliases": [ 00:05:34.971 "2e1acd0f-fe1c-489f-af4f-0f78867069b5" 00:05:34.971 ], 00:05:34.971 "product_name": "Malloc disk", 00:05:34.971 "block_size": 512, 00:05:34.971 "num_blocks": 16384, 00:05:34.971 "uuid": "2e1acd0f-fe1c-489f-af4f-0f78867069b5", 00:05:34.971 "assigned_rate_limits": { 00:05:34.971 "rw_ios_per_sec": 0, 00:05:34.971 "rw_mbytes_per_sec": 0, 00:05:34.971 "r_mbytes_per_sec": 0, 00:05:34.971 "w_mbytes_per_sec": 0 00:05:34.971 }, 00:05:34.971 "claimed": true, 00:05:34.971 "claim_type": "exclusive_write", 00:05:34.971 "zoned": false, 00:05:34.971 "supported_io_types": { 00:05:34.971 "read": true, 00:05:34.971 "write": true, 00:05:34.971 "unmap": true, 00:05:34.971 "flush": true, 00:05:34.971 "reset": true, 00:05:34.971 "nvme_admin": false, 00:05:34.971 "nvme_io": false, 00:05:34.971 "nvme_io_md": false, 00:05:34.971 "write_zeroes": true, 00:05:34.971 "zcopy": true, 00:05:34.971 "get_zone_info": false, 00:05:34.971 "zone_management": false, 00:05:34.971 "zone_append": false, 00:05:34.971 "compare": false, 00:05:34.971 "compare_and_write": false, 00:05:34.971 "abort": true, 00:05:34.971 "seek_hole": false, 00:05:34.971 "seek_data": false, 00:05:34.971 "copy": true, 00:05:34.971 "nvme_iov_md": false 00:05:34.971 }, 00:05:34.971 "memory_domains": [ 00:05:34.971 { 00:05:34.971 "dma_device_id": "system", 00:05:34.971 "dma_device_type": 1 00:05:34.971 }, 00:05:34.971 { 00:05:34.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.971 "dma_device_type": 2 00:05:34.971 } 00:05:34.971 ], 00:05:34.971 "driver_specific": {} 00:05:34.971 }, 00:05:34.971 { 00:05:34.971 
"name": "Passthru0", 00:05:34.971 "aliases": [ 00:05:34.971 "72f9e331-b0f1-5c2d-8f83-d534df7051d2" 00:05:34.971 ], 00:05:34.971 "product_name": "passthru", 00:05:34.971 "block_size": 512, 00:05:34.971 "num_blocks": 16384, 00:05:34.971 "uuid": "72f9e331-b0f1-5c2d-8f83-d534df7051d2", 00:05:34.971 "assigned_rate_limits": { 00:05:34.971 "rw_ios_per_sec": 0, 00:05:34.971 "rw_mbytes_per_sec": 0, 00:05:34.971 "r_mbytes_per_sec": 0, 00:05:34.971 "w_mbytes_per_sec": 0 00:05:34.971 }, 00:05:34.971 "claimed": false, 00:05:34.971 "zoned": false, 00:05:34.971 "supported_io_types": { 00:05:34.971 "read": true, 00:05:34.971 "write": true, 00:05:34.971 "unmap": true, 00:05:34.971 "flush": true, 00:05:34.971 "reset": true, 00:05:34.971 "nvme_admin": false, 00:05:34.971 "nvme_io": false, 00:05:34.971 "nvme_io_md": false, 00:05:34.971 "write_zeroes": true, 00:05:34.971 "zcopy": true, 00:05:34.971 "get_zone_info": false, 00:05:34.971 "zone_management": false, 00:05:34.971 "zone_append": false, 00:05:34.971 "compare": false, 00:05:34.971 "compare_and_write": false, 00:05:34.971 "abort": true, 00:05:34.971 "seek_hole": false, 00:05:34.971 "seek_data": false, 00:05:34.971 "copy": true, 00:05:34.971 "nvme_iov_md": false 00:05:34.971 }, 00:05:34.971 "memory_domains": [ 00:05:34.971 { 00:05:34.971 "dma_device_id": "system", 00:05:34.971 "dma_device_type": 1 00:05:34.971 }, 00:05:34.971 { 00:05:34.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.971 "dma_device_type": 2 00:05:34.971 } 00:05:34.971 ], 00:05:34.971 "driver_specific": { 00:05:34.971 "passthru": { 00:05:34.971 "name": "Passthru0", 00:05:34.971 "base_bdev_name": "Malloc0" 00:05:34.971 } 00:05:34.971 } 00:05:34.971 } 00:05:34.971 ]' 00:05:34.971 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:34.971 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:34.971 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:34.971 11:46:25 rpc.rpc_integrity -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.972 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.972 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:34.972 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:34.972 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:34.972 11:46:25 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:34.972 00:05:34.972 real 0m0.269s 00:05:34.972 user 0m0.163s 00:05:34.972 sys 0m0.036s 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:34.972 11:46:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:34.972 ************************************ 00:05:34.972 END TEST rpc_integrity 00:05:34.972 ************************************ 00:05:35.231 11:46:25 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:35.231 11:46:25 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:35.231 11:46:25 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:35.231 11:46:25 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.231 11:46:25 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.231 
************************************ 00:05:35.231 START TEST rpc_plugins 00:05:35.231 ************************************ 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:35.231 { 00:05:35.231 "name": "Malloc1", 00:05:35.231 "aliases": [ 00:05:35.231 "678d4fe9-be5d-4ef3-986a-26248436aebb" 00:05:35.231 ], 00:05:35.231 "product_name": "Malloc disk", 00:05:35.231 "block_size": 4096, 00:05:35.231 "num_blocks": 256, 00:05:35.231 "uuid": "678d4fe9-be5d-4ef3-986a-26248436aebb", 00:05:35.231 "assigned_rate_limits": { 00:05:35.231 "rw_ios_per_sec": 0, 00:05:35.231 "rw_mbytes_per_sec": 0, 00:05:35.231 "r_mbytes_per_sec": 0, 00:05:35.231 "w_mbytes_per_sec": 0 00:05:35.231 }, 00:05:35.231 "claimed": false, 00:05:35.231 "zoned": false, 00:05:35.231 "supported_io_types": { 00:05:35.231 "read": true, 00:05:35.231 "write": true, 00:05:35.231 "unmap": true, 00:05:35.231 "flush": true, 00:05:35.231 "reset": true, 00:05:35.231 "nvme_admin": false, 00:05:35.231 "nvme_io": false, 00:05:35.231 "nvme_io_md": false, 00:05:35.231 "write_zeroes": true, 00:05:35.231 "zcopy": true, 00:05:35.231 
"get_zone_info": false, 00:05:35.231 "zone_management": false, 00:05:35.231 "zone_append": false, 00:05:35.231 "compare": false, 00:05:35.231 "compare_and_write": false, 00:05:35.231 "abort": true, 00:05:35.231 "seek_hole": false, 00:05:35.231 "seek_data": false, 00:05:35.231 "copy": true, 00:05:35.231 "nvme_iov_md": false 00:05:35.231 }, 00:05:35.231 "memory_domains": [ 00:05:35.231 { 00:05:35.231 "dma_device_id": "system", 00:05:35.231 "dma_device_type": 1 00:05:35.231 }, 00:05:35.231 { 00:05:35.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:35.231 "dma_device_type": 2 00:05:35.231 } 00:05:35.231 ], 00:05:35.231 "driver_specific": {} 00:05:35.231 } 00:05:35.231 ]' 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:35.231 11:46:25 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:35.231 00:05:35.231 real 0m0.140s 00:05:35.231 user 0m0.092s 00:05:35.231 sys 0m0.012s 00:05:35.231 11:46:25 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:35.231 11:46:25 rpc.rpc_plugins -- 
common/autotest_common.sh@10 -- # set +x 00:05:35.231 ************************************ 00:05:35.231 END TEST rpc_plugins 00:05:35.231 ************************************ 00:05:35.232 11:46:25 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:35.232 11:46:25 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:35.232 11:46:25 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:35.232 11:46:25 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.232 11:46:25 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.232 ************************************ 00:05:35.232 START TEST rpc_trace_cmd_test 00:05:35.232 ************************************ 00:05:35.232 11:46:25 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:05:35.232 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:35.232 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:35.232 11:46:25 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.232 11:46:25 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:35.491 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid546516", 00:05:35.491 "tpoint_group_mask": "0x8", 00:05:35.491 "iscsi_conn": { 00:05:35.491 "mask": "0x2", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "scsi": { 00:05:35.491 "mask": "0x4", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "bdev": { 00:05:35.491 "mask": "0x8", 00:05:35.491 "tpoint_mask": "0xffffffffffffffff" 00:05:35.491 }, 00:05:35.491 "nvmf_rdma": { 00:05:35.491 "mask": "0x10", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "nvmf_tcp": { 00:05:35.491 "mask": "0x20", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 
00:05:35.491 "ftl": { 00:05:35.491 "mask": "0x40", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "blobfs": { 00:05:35.491 "mask": "0x80", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "dsa": { 00:05:35.491 "mask": "0x200", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "thread": { 00:05:35.491 "mask": "0x400", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "nvme_pcie": { 00:05:35.491 "mask": "0x800", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "iaa": { 00:05:35.491 "mask": "0x1000", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "nvme_tcp": { 00:05:35.491 "mask": "0x2000", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "bdev_nvme": { 00:05:35.491 "mask": "0x4000", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 }, 00:05:35.491 "sock": { 00:05:35.491 "mask": "0x8000", 00:05:35.491 "tpoint_mask": "0x0" 00:05:35.491 } 00:05:35.491 }' 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:35.491 00:05:35.491 real 0m0.227s 00:05:35.491 user 0m0.195s 00:05:35.491 sys 0m0.022s 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:05:35.491 11:46:25 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:35.491 ************************************ 00:05:35.491 END TEST rpc_trace_cmd_test 00:05:35.491 ************************************ 00:05:35.491 11:46:25 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:35.491 11:46:25 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:35.491 11:46:25 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:35.491 11:46:25 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:35.491 11:46:25 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:35.491 11:46:25 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.491 11:46:25 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.751 ************************************ 00:05:35.751 START TEST rpc_daemon_integrity 00:05:35.751 ************************************ 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.751 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.752 11:46:25 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:35.752 { 00:05:35.752 "name": "Malloc2", 00:05:35.752 "aliases": [ 00:05:35.752 "671185ba-bf9a-46ad-b78c-ecba798c0dba" 00:05:35.752 ], 00:05:35.752 "product_name": "Malloc disk", 00:05:35.752 "block_size": 512, 00:05:35.752 "num_blocks": 16384, 00:05:35.752 "uuid": "671185ba-bf9a-46ad-b78c-ecba798c0dba", 00:05:35.752 "assigned_rate_limits": { 00:05:35.752 "rw_ios_per_sec": 0, 00:05:35.752 "rw_mbytes_per_sec": 0, 00:05:35.752 "r_mbytes_per_sec": 0, 00:05:35.752 "w_mbytes_per_sec": 0 00:05:35.752 }, 00:05:35.752 "claimed": false, 00:05:35.752 "zoned": false, 00:05:35.752 "supported_io_types": { 00:05:35.752 "read": true, 00:05:35.752 "write": true, 00:05:35.752 "unmap": true, 00:05:35.752 "flush": true, 00:05:35.752 "reset": true, 00:05:35.752 "nvme_admin": false, 00:05:35.752 "nvme_io": false, 00:05:35.752 "nvme_io_md": false, 00:05:35.752 "write_zeroes": true, 00:05:35.752 "zcopy": true, 00:05:35.752 "get_zone_info": false, 00:05:35.752 "zone_management": false, 00:05:35.752 "zone_append": false, 00:05:35.752 "compare": false, 00:05:35.752 "compare_and_write": false, 00:05:35.752 "abort": true, 00:05:35.752 "seek_hole": false, 00:05:35.752 "seek_data": false, 00:05:35.752 "copy": true, 00:05:35.752 "nvme_iov_md": false 00:05:35.752 }, 00:05:35.752 "memory_domains": [ 00:05:35.752 { 00:05:35.752 "dma_device_id": "system", 00:05:35.752 "dma_device_type": 
1 00:05:35.752 }, 00:05:35.752 { 00:05:35.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:35.752 "dma_device_type": 2 00:05:35.752 } 00:05:35.752 ], 00:05:35.752 "driver_specific": {} 00:05:35.752 } 00:05:35.752 ]' 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.752 [2024-07-12 11:46:25.889400] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:35.752 [2024-07-12 11:46:25.889424] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:35.752 [2024-07-12 11:46:25.889437] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8f93e0 00:05:35.752 [2024-07-12 11:46:25.889443] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:35.752 [2024-07-12 11:46:25.890398] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:35.752 [2024-07-12 11:46:25.890417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:35.752 Passthru0 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 
00:05:35.752 { 00:05:35.752 "name": "Malloc2", 00:05:35.752 "aliases": [ 00:05:35.752 "671185ba-bf9a-46ad-b78c-ecba798c0dba" 00:05:35.752 ], 00:05:35.752 "product_name": "Malloc disk", 00:05:35.752 "block_size": 512, 00:05:35.752 "num_blocks": 16384, 00:05:35.752 "uuid": "671185ba-bf9a-46ad-b78c-ecba798c0dba", 00:05:35.752 "assigned_rate_limits": { 00:05:35.752 "rw_ios_per_sec": 0, 00:05:35.752 "rw_mbytes_per_sec": 0, 00:05:35.752 "r_mbytes_per_sec": 0, 00:05:35.752 "w_mbytes_per_sec": 0 00:05:35.752 }, 00:05:35.752 "claimed": true, 00:05:35.752 "claim_type": "exclusive_write", 00:05:35.752 "zoned": false, 00:05:35.752 "supported_io_types": { 00:05:35.752 "read": true, 00:05:35.752 "write": true, 00:05:35.752 "unmap": true, 00:05:35.752 "flush": true, 00:05:35.752 "reset": true, 00:05:35.752 "nvme_admin": false, 00:05:35.752 "nvme_io": false, 00:05:35.752 "nvme_io_md": false, 00:05:35.752 "write_zeroes": true, 00:05:35.752 "zcopy": true, 00:05:35.752 "get_zone_info": false, 00:05:35.752 "zone_management": false, 00:05:35.752 "zone_append": false, 00:05:35.752 "compare": false, 00:05:35.752 "compare_and_write": false, 00:05:35.752 "abort": true, 00:05:35.752 "seek_hole": false, 00:05:35.752 "seek_data": false, 00:05:35.752 "copy": true, 00:05:35.752 "nvme_iov_md": false 00:05:35.752 }, 00:05:35.752 "memory_domains": [ 00:05:35.752 { 00:05:35.752 "dma_device_id": "system", 00:05:35.752 "dma_device_type": 1 00:05:35.752 }, 00:05:35.752 { 00:05:35.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:35.752 "dma_device_type": 2 00:05:35.752 } 00:05:35.752 ], 00:05:35.752 "driver_specific": {} 00:05:35.752 }, 00:05:35.752 { 00:05:35.752 "name": "Passthru0", 00:05:35.752 "aliases": [ 00:05:35.752 "c7923453-744b-5a6e-9ae7-b3ca68da3d4e" 00:05:35.752 ], 00:05:35.752 "product_name": "passthru", 00:05:35.752 "block_size": 512, 00:05:35.752 "num_blocks": 16384, 00:05:35.752 "uuid": "c7923453-744b-5a6e-9ae7-b3ca68da3d4e", 00:05:35.752 "assigned_rate_limits": { 00:05:35.752 
"rw_ios_per_sec": 0, 00:05:35.752 "rw_mbytes_per_sec": 0, 00:05:35.752 "r_mbytes_per_sec": 0, 00:05:35.752 "w_mbytes_per_sec": 0 00:05:35.752 }, 00:05:35.752 "claimed": false, 00:05:35.752 "zoned": false, 00:05:35.752 "supported_io_types": { 00:05:35.752 "read": true, 00:05:35.752 "write": true, 00:05:35.752 "unmap": true, 00:05:35.752 "flush": true, 00:05:35.752 "reset": true, 00:05:35.752 "nvme_admin": false, 00:05:35.752 "nvme_io": false, 00:05:35.752 "nvme_io_md": false, 00:05:35.752 "write_zeroes": true, 00:05:35.752 "zcopy": true, 00:05:35.752 "get_zone_info": false, 00:05:35.752 "zone_management": false, 00:05:35.752 "zone_append": false, 00:05:35.752 "compare": false, 00:05:35.752 "compare_and_write": false, 00:05:35.752 "abort": true, 00:05:35.752 "seek_hole": false, 00:05:35.752 "seek_data": false, 00:05:35.752 "copy": true, 00:05:35.752 "nvme_iov_md": false 00:05:35.752 }, 00:05:35.752 "memory_domains": [ 00:05:35.752 { 00:05:35.752 "dma_device_id": "system", 00:05:35.752 "dma_device_type": 1 00:05:35.752 }, 00:05:35.752 { 00:05:35.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:35.752 "dma_device_type": 2 00:05:35.752 } 00:05:35.752 ], 00:05:35.752 "driver_specific": { 00:05:35.752 "passthru": { 00:05:35.752 "name": "Passthru0", 00:05:35.752 "base_bdev_name": "Malloc2" 00:05:35.752 } 00:05:35.752 } 00:05:35.752 } 00:05:35.752 ]' 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # 
rpc_cmd bdev_malloc_delete Malloc2 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:35.752 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:36.011 11:46:25 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:36.011 11:46:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:36.011 00:05:36.011 real 0m0.276s 00:05:36.011 user 0m0.172s 00:05:36.011 sys 0m0.043s 00:05:36.011 11:46:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:36.011 11:46:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:36.011 ************************************ 00:05:36.011 END TEST rpc_daemon_integrity 00:05:36.011 ************************************ 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:36.011 11:46:26 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:36.011 11:46:26 rpc -- rpc/rpc.sh@84 -- # killprocess 546516 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@948 -- # '[' -z 546516 ']' 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@952 -- # kill -0 546516 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@953 -- # uname 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
546516 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 546516' 00:05:36.011 killing process with pid 546516 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@967 -- # kill 546516 00:05:36.011 11:46:26 rpc -- common/autotest_common.sh@972 -- # wait 546516 00:05:36.270 00:05:36.270 real 0m2.461s 00:05:36.270 user 0m3.156s 00:05:36.270 sys 0m0.672s 00:05:36.270 11:46:26 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:36.270 11:46:26 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:36.270 ************************************ 00:05:36.270 END TEST rpc 00:05:36.270 ************************************ 00:05:36.270 11:46:26 -- common/autotest_common.sh@1142 -- # return 0 00:05:36.270 11:46:26 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:36.270 11:46:26 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:36.270 11:46:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.270 11:46:26 -- common/autotest_common.sh@10 -- # set +x 00:05:36.270 ************************************ 00:05:36.270 START TEST skip_rpc 00:05:36.270 ************************************ 00:05:36.271 11:46:26 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:36.530 * Looking for test storage... 
00:05:36.530 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:36.530 11:46:26 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:36.530 11:46:26 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:36.530 11:46:26 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:36.530 11:46:26 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:36.530 11:46:26 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.530 11:46:26 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:36.530 ************************************ 00:05:36.530 START TEST skip_rpc 00:05:36.530 ************************************ 00:05:36.530 11:46:26 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:05:36.530 11:46:26 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=547145 00:05:36.530 11:46:26 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:36.530 11:46:26 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:36.530 11:46:26 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:36.530 [2024-07-12 11:46:26.672018] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:05:36.530 [2024-07-12 11:46:26.672053] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid547145 ] 00:05:36.530 [2024-07-12 11:46:26.745415] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:36.789 [2024-07-12 11:46:26.818692] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT 
SIGTERM EXIT 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 547145 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 547145 ']' 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 547145 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 547145 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 547145' 00:05:42.060 killing process with pid 547145 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 547145 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 547145 00:05:42.060 00:05:42.060 real 0m5.371s 00:05:42.060 user 0m5.106s 00:05:42.060 sys 0m0.282s 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:42.060 11:46:31 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:42.060 ************************************ 00:05:42.060 END TEST skip_rpc 00:05:42.060 ************************************ 00:05:42.060 11:46:32 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:42.060 11:46:32 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:42.060 11:46:32 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:42.060 11:46:32 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.060 11:46:32 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:42.060 
************************************ 00:05:42.060 START TEST skip_rpc_with_json 00:05:42.060 ************************************ 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=548079 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 548079 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 548079 ']' 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:42.060 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:42.060 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:42.060 [2024-07-12 11:46:32.114130] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:05:42.060 [2024-07-12 11:46:32.114172] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid548079 ] 00:05:42.060 [2024-07-12 11:46:32.191816] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:42.060 [2024-07-12 11:46:32.262598] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:42.996 [2024-07-12 11:46:32.900420] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:42.996 request: 00:05:42.996 { 00:05:42.996 "trtype": "tcp", 00:05:42.996 "method": "nvmf_get_transports", 00:05:42.996 "req_id": 1 00:05:42.996 } 00:05:42.996 Got JSON-RPC error response 00:05:42.996 response: 00:05:42.996 { 00:05:42.996 "code": -19, 00:05:42.996 "message": "No such device" 00:05:42.996 } 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:42.996 [2024-07-12 11:46:32.908528] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:42.996 11:46:32 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:42.996 11:46:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:42.996 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:42.996 11:46:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:42.996 { 00:05:42.996 "subsystems": [ 00:05:42.996 { 00:05:42.996 "subsystem": "keyring", 00:05:42.996 "config": [] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "iobuf", 00:05:42.996 "config": [ 00:05:42.996 { 00:05:42.996 "method": "iobuf_set_options", 00:05:42.996 "params": { 00:05:42.996 "small_pool_count": 8192, 00:05:42.996 "large_pool_count": 1024, 00:05:42.996 "small_bufsize": 8192, 00:05:42.996 "large_bufsize": 135168 00:05:42.996 } 00:05:42.996 } 00:05:42.996 ] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "sock", 00:05:42.996 "config": [ 00:05:42.996 { 00:05:42.996 "method": "sock_set_default_impl", 00:05:42.996 "params": { 00:05:42.996 "impl_name": "posix" 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "sock_impl_set_options", 00:05:42.996 "params": { 00:05:42.996 "impl_name": "ssl", 00:05:42.996 "recv_buf_size": 4096, 00:05:42.996 "send_buf_size": 4096, 00:05:42.996 "enable_recv_pipe": true, 00:05:42.996 "enable_quickack": false, 00:05:42.996 "enable_placement_id": 0, 00:05:42.996 "enable_zerocopy_send_server": true, 00:05:42.996 "enable_zerocopy_send_client": false, 00:05:42.996 "zerocopy_threshold": 0, 00:05:42.996 "tls_version": 0, 00:05:42.996 "enable_ktls": false 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "sock_impl_set_options", 00:05:42.996 "params": { 
00:05:42.996 "impl_name": "posix", 00:05:42.996 "recv_buf_size": 2097152, 00:05:42.996 "send_buf_size": 2097152, 00:05:42.996 "enable_recv_pipe": true, 00:05:42.996 "enable_quickack": false, 00:05:42.996 "enable_placement_id": 0, 00:05:42.996 "enable_zerocopy_send_server": true, 00:05:42.996 "enable_zerocopy_send_client": false, 00:05:42.996 "zerocopy_threshold": 0, 00:05:42.996 "tls_version": 0, 00:05:42.996 "enable_ktls": false 00:05:42.996 } 00:05:42.996 } 00:05:42.996 ] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "vmd", 00:05:42.996 "config": [] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "accel", 00:05:42.996 "config": [ 00:05:42.996 { 00:05:42.996 "method": "accel_set_options", 00:05:42.996 "params": { 00:05:42.996 "small_cache_size": 128, 00:05:42.996 "large_cache_size": 16, 00:05:42.996 "task_count": 2048, 00:05:42.996 "sequence_count": 2048, 00:05:42.996 "buf_count": 2048 00:05:42.996 } 00:05:42.996 } 00:05:42.996 ] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "bdev", 00:05:42.996 "config": [ 00:05:42.996 { 00:05:42.996 "method": "bdev_set_options", 00:05:42.996 "params": { 00:05:42.996 "bdev_io_pool_size": 65535, 00:05:42.996 "bdev_io_cache_size": 256, 00:05:42.996 "bdev_auto_examine": true, 00:05:42.996 "iobuf_small_cache_size": 128, 00:05:42.996 "iobuf_large_cache_size": 16 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "bdev_raid_set_options", 00:05:42.996 "params": { 00:05:42.996 "process_window_size_kb": 1024 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "bdev_iscsi_set_options", 00:05:42.996 "params": { 00:05:42.996 "timeout_sec": 30 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "bdev_nvme_set_options", 00:05:42.996 "params": { 00:05:42.996 "action_on_timeout": "none", 00:05:42.996 "timeout_us": 0, 00:05:42.996 "timeout_admin_us": 0, 00:05:42.996 "keep_alive_timeout_ms": 10000, 00:05:42.996 "arbitration_burst": 0, 00:05:42.996 
"low_priority_weight": 0, 00:05:42.996 "medium_priority_weight": 0, 00:05:42.996 "high_priority_weight": 0, 00:05:42.996 "nvme_adminq_poll_period_us": 10000, 00:05:42.996 "nvme_ioq_poll_period_us": 0, 00:05:42.996 "io_queue_requests": 0, 00:05:42.996 "delay_cmd_submit": true, 00:05:42.996 "transport_retry_count": 4, 00:05:42.996 "bdev_retry_count": 3, 00:05:42.996 "transport_ack_timeout": 0, 00:05:42.996 "ctrlr_loss_timeout_sec": 0, 00:05:42.996 "reconnect_delay_sec": 0, 00:05:42.996 "fast_io_fail_timeout_sec": 0, 00:05:42.996 "disable_auto_failback": false, 00:05:42.996 "generate_uuids": false, 00:05:42.996 "transport_tos": 0, 00:05:42.996 "nvme_error_stat": false, 00:05:42.996 "rdma_srq_size": 0, 00:05:42.996 "io_path_stat": false, 00:05:42.996 "allow_accel_sequence": false, 00:05:42.996 "rdma_max_cq_size": 0, 00:05:42.996 "rdma_cm_event_timeout_ms": 0, 00:05:42.996 "dhchap_digests": [ 00:05:42.996 "sha256", 00:05:42.996 "sha384", 00:05:42.996 "sha512" 00:05:42.996 ], 00:05:42.996 "dhchap_dhgroups": [ 00:05:42.996 "null", 00:05:42.996 "ffdhe2048", 00:05:42.996 "ffdhe3072", 00:05:42.996 "ffdhe4096", 00:05:42.996 "ffdhe6144", 00:05:42.996 "ffdhe8192" 00:05:42.996 ] 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "bdev_nvme_set_hotplug", 00:05:42.996 "params": { 00:05:42.996 "period_us": 100000, 00:05:42.996 "enable": false 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "bdev_wait_for_examine" 00:05:42.996 } 00:05:42.996 ] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "scsi", 00:05:42.996 "config": null 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "scheduler", 00:05:42.996 "config": [ 00:05:42.996 { 00:05:42.996 "method": "framework_set_scheduler", 00:05:42.996 "params": { 00:05:42.996 "name": "static" 00:05:42.996 } 00:05:42.996 } 00:05:42.996 ] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "vhost_scsi", 00:05:42.996 "config": [] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": 
"vhost_blk", 00:05:42.996 "config": [] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "ublk", 00:05:42.996 "config": [] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "nbd", 00:05:42.996 "config": [] 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "subsystem": "nvmf", 00:05:42.996 "config": [ 00:05:42.996 { 00:05:42.996 "method": "nvmf_set_config", 00:05:42.996 "params": { 00:05:42.996 "discovery_filter": "match_any", 00:05:42.996 "admin_cmd_passthru": { 00:05:42.996 "identify_ctrlr": false 00:05:42.996 } 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "nvmf_set_max_subsystems", 00:05:42.996 "params": { 00:05:42.996 "max_subsystems": 1024 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "nvmf_set_crdt", 00:05:42.996 "params": { 00:05:42.996 "crdt1": 0, 00:05:42.996 "crdt2": 0, 00:05:42.996 "crdt3": 0 00:05:42.996 } 00:05:42.996 }, 00:05:42.996 { 00:05:42.996 "method": "nvmf_create_transport", 00:05:42.996 "params": { 00:05:42.996 "trtype": "TCP", 00:05:42.996 "max_queue_depth": 128, 00:05:42.996 "max_io_qpairs_per_ctrlr": 127, 00:05:42.996 "in_capsule_data_size": 4096, 00:05:42.997 "max_io_size": 131072, 00:05:42.997 "io_unit_size": 131072, 00:05:42.997 "max_aq_depth": 128, 00:05:42.997 "num_shared_buffers": 511, 00:05:42.997 "buf_cache_size": 4294967295, 00:05:42.997 "dif_insert_or_strip": false, 00:05:42.997 "zcopy": false, 00:05:42.997 "c2h_success": true, 00:05:42.997 "sock_priority": 0, 00:05:42.997 "abort_timeout_sec": 1, 00:05:42.997 "ack_timeout": 0, 00:05:42.997 "data_wr_pool_size": 0 00:05:42.997 } 00:05:42.997 } 00:05:42.997 ] 00:05:42.997 }, 00:05:42.997 { 00:05:42.997 "subsystem": "iscsi", 00:05:42.997 "config": [ 00:05:42.997 { 00:05:42.997 "method": "iscsi_set_options", 00:05:42.997 "params": { 00:05:42.997 "node_base": "iqn.2016-06.io.spdk", 00:05:42.997 "max_sessions": 128, 00:05:42.997 "max_connections_per_session": 2, 00:05:42.997 "max_queue_depth": 64, 00:05:42.997 "default_time2wait": 2, 
00:05:42.997 "default_time2retain": 20, 00:05:42.997 "first_burst_length": 8192, 00:05:42.997 "immediate_data": true, 00:05:42.997 "allow_duplicated_isid": false, 00:05:42.997 "error_recovery_level": 0, 00:05:42.997 "nop_timeout": 60, 00:05:42.997 "nop_in_interval": 30, 00:05:42.997 "disable_chap": false, 00:05:42.997 "require_chap": false, 00:05:42.997 "mutual_chap": false, 00:05:42.997 "chap_group": 0, 00:05:42.997 "max_large_datain_per_connection": 64, 00:05:42.997 "max_r2t_per_connection": 4, 00:05:42.997 "pdu_pool_size": 36864, 00:05:42.997 "immediate_data_pool_size": 16384, 00:05:42.997 "data_out_pool_size": 2048 00:05:42.997 } 00:05:42.997 } 00:05:42.997 ] 00:05:42.997 } 00:05:42.997 ] 00:05:42.997 } 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 548079 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 548079 ']' 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 548079 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 548079 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 548079' 00:05:42.997 killing process with pid 548079 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 548079 00:05:42.997 11:46:33 skip_rpc.skip_rpc_with_json -- 
common/autotest_common.sh@972 -- # wait 548079 00:05:43.255 11:46:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=548323 00:05:43.255 11:46:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:43.255 11:46:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 548323 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 548323 ']' 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 548323 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 548323 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 548323' 00:05:48.520 killing process with pid 548323 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 548323 00:05:48.520 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 548323 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:48.779 00:05:48.779 real 0m6.731s 00:05:48.779 user 0m6.486s 00:05:48.779 sys 0m0.626s 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:48.779 ************************************ 00:05:48.779 END TEST skip_rpc_with_json 00:05:48.779 ************************************ 00:05:48.779 11:46:38 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:48.779 11:46:38 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:48.779 11:46:38 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:48.779 11:46:38 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.779 11:46:38 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:48.779 ************************************ 00:05:48.779 START TEST skip_rpc_with_delay 00:05:48.779 ************************************ 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:48.779 11:46:38 
skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:48.779 [2024-07-12 11:46:38.921360] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:05:48.779 [2024-07-12 11:46:38.921415] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:48.779 00:05:48.779 real 0m0.072s 00:05:48.779 user 0m0.049s 00:05:48.779 sys 0m0.023s 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:48.779 11:46:38 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:05:48.779 ************************************ 00:05:48.779 END TEST skip_rpc_with_delay 00:05:48.779 ************************************ 00:05:48.779 11:46:38 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:48.779 11:46:38 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:05:48.779 11:46:38 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:48.779 11:46:38 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:48.779 11:46:38 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:48.779 11:46:38 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.779 11:46:38 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:48.779 ************************************ 00:05:48.779 START TEST exit_on_failed_rpc_init 00:05:48.779 ************************************ 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=549280 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 549280 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 549280 ']' 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.779 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:48.779 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:49.038 [2024-07-12 11:46:39.058179] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:05:49.038 [2024-07-12 11:46:39.058215] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid549280 ] 00:05:49.038 [2024-07-12 11:46:39.133895] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.038 [2024-07-12 11:46:39.204547] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:49.975 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.976 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:49.976 11:46:39 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:49.976 [2024-07-12 11:46:39.917020] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:05:49.976 [2024-07-12 11:46:39.917062] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid549385 ] 00:05:49.976 [2024-07-12 11:46:39.992566] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.976 [2024-07-12 11:46:40.074735] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:49.976 [2024-07-12 11:46:40.074795] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:05:49.976 [2024-07-12 11:46:40.074804] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:49.976 [2024-07-12 11:46:40.074810] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 549280 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 549280 ']' 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 549280 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 549280 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 549280' 
00:05:49.976 killing process with pid 549280 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 549280 00:05:49.976 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 549280 00:05:50.545 00:05:50.545 real 0m1.506s 00:05:50.545 user 0m1.727s 00:05:50.545 sys 0m0.433s 00:05:50.545 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.545 11:46:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:50.545 ************************************ 00:05:50.545 END TEST exit_on_failed_rpc_init 00:05:50.545 ************************************ 00:05:50.545 11:46:40 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:50.545 11:46:40 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:50.545 00:05:50.545 real 0m14.053s 00:05:50.545 user 0m13.530s 00:05:50.545 sys 0m1.601s 00:05:50.545 11:46:40 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.545 11:46:40 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:50.545 ************************************ 00:05:50.545 END TEST skip_rpc 00:05:50.545 ************************************ 00:05:50.545 11:46:40 -- common/autotest_common.sh@1142 -- # return 0 00:05:50.545 11:46:40 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:50.545 11:46:40 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:50.545 11:46:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:50.545 11:46:40 -- common/autotest_common.sh@10 -- # set +x 00:05:50.545 ************************************ 00:05:50.545 START TEST rpc_client 00:05:50.545 ************************************ 00:05:50.545 11:46:40 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:50.545 * Looking for test storage... 00:05:50.545 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:05:50.545 11:46:40 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:50.545 OK 00:05:50.545 11:46:40 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:50.545 00:05:50.545 real 0m0.118s 00:05:50.545 user 0m0.048s 00:05:50.545 sys 0m0.078s 00:05:50.545 11:46:40 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.545 11:46:40 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:05:50.545 ************************************ 00:05:50.545 END TEST rpc_client 00:05:50.545 ************************************ 00:05:50.545 11:46:40 -- common/autotest_common.sh@1142 -- # return 0 00:05:50.545 11:46:40 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:50.545 11:46:40 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:50.545 11:46:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:50.545 11:46:40 -- common/autotest_common.sh@10 -- # set +x 00:05:50.840 ************************************ 00:05:50.840 START TEST json_config 00:05:50.840 ************************************ 00:05:50.840 11:46:40 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@7 -- # uname -s 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:50.840 11:46:40 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:50.840 11:46:40 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:50.840 11:46:40 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:50.840 11:46:40 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:50.840 11:46:40 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:50.840 11:46:40 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:50.840 11:46:40 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:50.840 11:46:40 json_config -- paths/export.sh@5 -- # export PATH 00:05:50.840 11:46:40 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@47 -- # : 0 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:50.840 
11:46:40 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:50.840 11:46:40 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:05:50.840 11:46:40 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:05:50.841 11:46:40 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:50.841 11:46:40 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:05:50.841 INFO: JSON configuration test init 00:05:50.841 11:46:40 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:05:50.841 11:46:40 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:50.841 11:46:40 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:50.841 11:46:40 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:05:50.841 11:46:40 json_config -- json_config/common.sh@9 -- # local app=target 00:05:50.841 11:46:40 json_config -- json_config/common.sh@10 -- # shift 00:05:50.841 11:46:40 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:50.841 11:46:40 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:50.841 11:46:40 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:50.841 11:46:40 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:50.841 11:46:40 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:05:50.841 11:46:40 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=549632 00:05:50.841 11:46:40 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:50.841 Waiting for target to run... 00:05:50.841 11:46:40 json_config -- json_config/common.sh@25 -- # waitforlisten 549632 /var/tmp/spdk_tgt.sock 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@829 -- # '[' -z 549632 ']' 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:50.841 11:46:40 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:05:50.841 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:50.841 11:46:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:50.841 [2024-07-12 11:46:40.959682] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:05:50.841 [2024-07-12 11:46:40.959726] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid549632 ] 00:05:51.102 [2024-07-12 11:46:41.252586] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.102 [2024-07-12 11:46:41.322051] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.671 11:46:41 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:51.671 11:46:41 json_config -- common/autotest_common.sh@862 -- # return 0 00:05:51.671 11:46:41 json_config -- json_config/common.sh@26 -- # echo '' 00:05:51.671 00:05:51.671 11:46:41 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:05:51.671 11:46:41 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:05:51.671 11:46:41 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:51.671 11:46:41 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:51.671 11:46:41 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:05:51.671 11:46:41 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:05:51.671 11:46:41 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:05:51.671 11:46:41 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:51.671 11:46:41 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:51.930 [2024-07-12 11:46:42.064252] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:51.930 11:46:42 
json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:51.930 11:46:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:52.189 [2024-07-12 11:46:42.224648] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:52.189 11:46:42 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:05:52.189 11:46:42 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:52.189 11:46:42 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:52.189 11:46:42 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:05:52.189 11:46:42 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:05:52.189 11:46:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:05:52.448 [2024-07-12 11:46:42.448344] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:05:57.720 11:46:47 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:57.720 11:46:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:05:57.720 11:46:47 json_config -- 
json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:05:57.720 11:46:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@48 -- # local get_types 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:05:57.720 11:46:47 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:57.720 11:46:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@55 -- # return 0 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:05:57.720 11:46:47 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:57.720 11:46:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@59 
-- # local ev_type ev_ctx event_id 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:57.720 11:46:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:05:57.720 11:46:47 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:05:57.720 11:46:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:05:57.979 Nvme0n1p0 Nvme0n1p1 00:05:57.980 11:46:47 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:05:57.980 11:46:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:05:57.980 [2024-07-12 11:46:48.133103] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:57.980 [2024-07-12 11:46:48.133139] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:57.980 
00:05:57.980 11:46:48 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:05:57.980 11:46:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:05:58.238 Malloc3 00:05:58.238 11:46:48 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:58.238 11:46:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:58.238 [2024-07-12 11:46:48.453988] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:58.238 [2024-07-12 11:46:48.454020] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:58.238 [2024-07-12 11:46:48.454031] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd35d0 00:05:58.238 [2024-07-12 11:46:48.454053] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:58.238 [2024-07-12 11:46:48.455251] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:58.238 [2024-07-12 11:46:48.455271] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:58.238 PTBdevFromMalloc3 00:05:58.238 11:46:48 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:05:58.238 11:46:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:05:58.497 Null0 00:05:58.497 11:46:48 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:05:58.497 11:46:48 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:05:58.756 Malloc0 00:05:58.756 11:46:48 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:05:58.756 11:46:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:05:58.756 Malloc1 00:05:58.756 11:46:48 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:05:58.756 11:46:48 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:05:59.015 102400+0 records in 00:05:59.015 102400+0 records out 00:05:59.015 104857600 bytes (105 MB, 100 MiB) copied, 0.117548 s, 892 MB/s 00:05:59.015 11:46:49 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:05:59.015 11:46:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:05:59.015 aio_disk 00:05:59.015 11:46:49 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:05:59.015 11:46:49 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:59.015 11:46:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:01.549 85e32427-a148-43a8-be5a-94e2100d6db4 
00:06:01.549 11:46:51 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:01.549 11:46:51 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:01.549 11:46:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:01.549 11:46:51 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:01.549 11:46:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:01.549 11:46:51 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:01.549 11:46:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:01.808 11:46:51 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:01.808 11:46:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:02.067 11:46:52 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:02.067 11:46:52 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:02.067 11:46:52 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:02.067 MallocForCryptoBdev 00:06:02.067 11:46:52 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:02.067 11:46:52 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:02.067 11:46:52 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:02.067 11:46:52 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:02.067 11:46:52 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:02.067 11:46:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:02.326 [2024-07-12 11:46:52.441833] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:02.327 CryptoMallocBdev 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:0f0e2f07-e1d7-42e7-a69d-c59e3caf91ba bdev_register:001090fb-e284-467c-b6cd-79e26147e777 bdev_register:288128a5-64bf-4df7-a828-b5e0018b23ce bdev_register:40ffd1da-5ce8-4a33-9c12-8649a4d94372 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:0f0e2f07-e1d7-42e7-a69d-c59e3caf91ba bdev_register:001090fb-e284-467c-b6cd-79e26147e777 bdev_register:288128a5-64bf-4df7-a828-b5e0018b23ce bdev_register:40ffd1da-5ce8-4a33-9c12-8649a4d94372 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@71 -- # sort 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@72 -- # sort 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:02.327 11:46:52 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:02.327 11:46:52 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:02.585 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:0f0e2f07-e1d7-42e7-a69d-c59e3caf91ba 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:001090fb-e284-467c-b6cd-79e26147e777 00:06:02.586 11:46:52 json_config -- 
json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:288128a5-64bf-4df7-a828-b5e0018b23ce 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:40ffd1da-5ce8-4a33-9c12-8649a4d94372 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:001090fb-e284-467c-b6cd-79e26147e777 bdev_register:0f0e2f07-e1d7-42e7-a69d-c59e3caf91ba bdev_register:288128a5-64bf-4df7-a828-b5e0018b23ce bdev_register:40ffd1da-5ce8-4a33-9c12-8649a4d94372 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\0\1\0\9\0\f\b\-\e\2\8\4\-\4\6\7\c\-\b\6\c\d\-\7\9\e\2\6\1\4\7\e\7\7\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\f\0\e\2\f\0\7\-\e\1\d\7\-\4\2\e\7\-\a\6\9\d\-\c\5\9\e\3\c\a\f\9\1\b\a\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\2\8\8\1\2\8\a\5\-\6\4\b\f\-\4\d\f\7\-\a\8\2\8\-\b\5\e\0\0\1\8\b\2\3\c\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\4\0\f\f\d\1\d\a\-\5\c\e\8\-\4\a\3\3\-\9\c\1\2\-\8\6\4\9\a\4\d\9\4\3\7\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@86 -- # cat 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:001090fb-e284-467c-b6cd-79e26147e777 bdev_register:0f0e2f07-e1d7-42e7-a69d-c59e3caf91ba bdev_register:288128a5-64bf-4df7-a828-b5e0018b23ce bdev_register:40ffd1da-5ce8-4a33-9c12-8649a4d94372 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:02.586 Expected events matched: 00:06:02.586 bdev_register:001090fb-e284-467c-b6cd-79e26147e777 00:06:02.586 bdev_register:0f0e2f07-e1d7-42e7-a69d-c59e3caf91ba 00:06:02.586 
bdev_register:288128a5-64bf-4df7-a828-b5e0018b23ce 00:06:02.586 bdev_register:40ffd1da-5ce8-4a33-9c12-8649a4d94372 00:06:02.586 bdev_register:aio_disk 00:06:02.586 bdev_register:CryptoMallocBdev 00:06:02.586 bdev_register:Malloc0 00:06:02.586 bdev_register:Malloc0p0 00:06:02.586 bdev_register:Malloc0p1 00:06:02.586 bdev_register:Malloc0p2 00:06:02.586 bdev_register:Malloc1 00:06:02.586 bdev_register:Malloc3 00:06:02.586 bdev_register:MallocForCryptoBdev 00:06:02.586 bdev_register:Null0 00:06:02.586 bdev_register:Nvme0n1 00:06:02.586 bdev_register:Nvme0n1p0 00:06:02.586 bdev_register:Nvme0n1p1 00:06:02.586 bdev_register:PTBdevFromMalloc3 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:02.586 11:46:52 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:02.586 11:46:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:02.586 11:46:52 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:02.586 11:46:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:02.586 11:46:52 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:02.586 11:46:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:02.845 MallocBdevForConfigChangeCheck 00:06:02.845 11:46:52 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:02.845 11:46:52 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:02.845 11:46:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:02.845 11:46:52 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:02.845 11:46:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:03.102 11:46:53 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:03.102 INFO: shutting down applications... 00:06:03.102 11:46:53 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:03.102 11:46:53 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:03.102 11:46:53 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:03.102 11:46:53 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:03.361 [2024-07-12 11:46:53.408762] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:04.770 Calling clear_iscsi_subsystem 00:06:04.770 Calling clear_nvmf_subsystem 00:06:04.770 Calling clear_nbd_subsystem 00:06:04.770 Calling clear_ublk_subsystem 00:06:04.770 Calling clear_vhost_blk_subsystem 00:06:04.770 Calling clear_vhost_scsi_subsystem 00:06:04.770 Calling clear_bdev_subsystem 00:06:04.770 11:46:54 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:04.770 11:46:54 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:04.770 11:46:54 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:04.770 11:46:54 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:04.770 11:46:54 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:04.770 11:46:54 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:05.071 11:46:55 json_config -- json_config/json_config.sh@345 -- # break 00:06:05.071 11:46:55 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:05.071 11:46:55 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:05.071 11:46:55 json_config -- json_config/common.sh@31 -- # local app=target 00:06:05.071 11:46:55 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:05.071 11:46:55 json_config -- json_config/common.sh@35 -- # [[ -n 549632 ]] 00:06:05.071 11:46:55 json_config -- json_config/common.sh@38 -- # kill -SIGINT 549632 00:06:05.071 11:46:55 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:05.071 11:46:55 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:05.071 11:46:55 json_config -- json_config/common.sh@41 -- # kill -0 549632 00:06:05.071 11:46:55 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:05.708 11:46:55 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:05.708 11:46:55 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:05.708 11:46:55 json_config -- json_config/common.sh@41 -- # kill -0 549632 00:06:05.708 11:46:55 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:05.708 11:46:55 json_config -- json_config/common.sh@43 -- # break 00:06:05.708 11:46:55 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:05.708 11:46:55 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:05.708 SPDK target 
shutdown done 00:06:05.708 11:46:55 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:05.708 INFO: relaunching applications... 00:06:05.708 11:46:55 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:05.708 11:46:55 json_config -- json_config/common.sh@9 -- # local app=target 00:06:05.708 11:46:55 json_config -- json_config/common.sh@10 -- # shift 00:06:05.708 11:46:55 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:05.708 11:46:55 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:05.708 11:46:55 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:05.708 11:46:55 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:05.708 11:46:55 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:05.708 11:46:55 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=552288 00:06:05.708 11:46:55 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:05.708 Waiting for target to run... 00:06:05.708 11:46:55 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:05.708 11:46:55 json_config -- json_config/common.sh@25 -- # waitforlisten 552288 /var/tmp/spdk_tgt.sock 00:06:05.708 11:46:55 json_config -- common/autotest_common.sh@829 -- # '[' -z 552288 ']' 00:06:05.708 11:46:55 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:05.708 11:46:55 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:05.708 11:46:55 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:06:05.708 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:05.708 11:46:55 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:05.708 11:46:55 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:05.708 [2024-07-12 11:46:55.763861] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:05.708 [2024-07-12 11:46:55.763902] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid552288 ] 00:06:05.966 [2024-07-12 11:46:56.063205] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.966 [2024-07-12 11:46:56.130963] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.966 [2024-07-12 11:46:56.184385] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:05.966 [2024-07-12 11:46:56.192415] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:05.966 [2024-07-12 11:46:56.200433] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:06.225 [2024-07-12 11:46:56.279840] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:08.759 [2024-07-12 11:46:58.399691] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:08.759 [2024-07-12 11:46:58.399729] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:08.759 [2024-07-12 11:46:58.399736] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:08.759 [2024-07-12 11:46:58.407713] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:08.759 
[2024-07-12 11:46:58.407729] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:08.759 [2024-07-12 11:46:58.415726] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:08.759 [2024-07-12 11:46:58.415739] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:08.759 [2024-07-12 11:46:58.423756] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:08.759 [2024-07-12 11:46:58.423770] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:08.759 [2024-07-12 11:46:58.423775] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:11.291 [2024-07-12 11:47:01.273114] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:11.291 [2024-07-12 11:47:01.273145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:11.291 [2024-07-12 11:47:01.273154] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2374f80 00:06:11.291 [2024-07-12 11:47:01.273159] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:11.291 [2024-07-12 11:47:01.273371] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:11.291 [2024-07-12 11:47:01.273381] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:11.550 11:47:01 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:11.550 11:47:01 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:11.550 11:47:01 json_config -- json_config/common.sh@26 -- # echo '' 00:06:11.550 00:06:11.550 11:47:01 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:11.550 11:47:01 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 
00:06:11.550 INFO: Checking if target configuration is the same... 00:06:11.550 11:47:01 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:11.550 11:47:01 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:11.550 11:47:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:11.550 + '[' 2 -ne 2 ']' 00:06:11.550 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:11.550 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:11.550 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:11.550 +++ basename /dev/fd/62 00:06:11.550 ++ mktemp /tmp/62.XXX 00:06:11.550 + tmp_file_1=/tmp/62.ZgA 00:06:11.550 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:11.550 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:11.550 + tmp_file_2=/tmp/spdk_tgt_config.json.BlD 00:06:11.550 + ret=0 00:06:11.550 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:11.809 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:11.809 + diff -u /tmp/62.ZgA /tmp/spdk_tgt_config.json.BlD 00:06:11.809 + echo 'INFO: JSON config files are the same' 00:06:11.809 INFO: JSON config files are the same 00:06:11.809 + rm /tmp/62.ZgA /tmp/spdk_tgt_config.json.BlD 00:06:11.809 + exit 0 00:06:11.809 11:47:01 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:06:11.809 11:47:01 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:06:11.809 INFO: changing configuration and checking if this can be detected... 
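The `json_diff.sh` run traced above canonicalizes both JSON dumps before diffing, so ordering differences in the saved config do not register as changes. A self-contained sketch of that flow, with plain `sort` standing in for `config_filter.py -method sort` and illustrative file contents (not the real SPDK config):

```shell
# Two "configs" that differ only in entry order.
tmp1=$(mktemp /tmp/62.XXX)
tmp2=$(mktemp /tmp/spdk_tgt_config.json.XXX)
printf 'bdev_malloc_create\nbdev_aio_create\n' > "$tmp1"
printf 'bdev_aio_create\nbdev_malloc_create\n' > "$tmp2"
# Canonicalize each file in place (the real script pipes each config
# through config_filter.py -method sort instead).
sort -o "$tmp1" "$tmp1"
sort -o "$tmp2" "$tmp2"
if diff -u "$tmp1" "$tmp2" > /dev/null; then
    result='INFO: JSON config files are the same'
    echo "$result"
fi
rm -f "$tmp1" "$tmp2"
```

After a deliberate change (the `bdev_malloc_delete MallocBdevForConfigChangeCheck` below), the same diff is expected to fail, which is what the `ret=1` branch of the trace verifies.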
00:06:11.809 11:47:01 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:11.809 11:47:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:12.067 11:47:02 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:06:12.067 11:47:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:12.067 11:47:02 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:12.067 + '[' 2 -ne 2 ']' 00:06:12.067 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:12.067 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:06:12.067 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:12.067 +++ basename /dev/fd/62 00:06:12.067 ++ mktemp /tmp/62.XXX 00:06:12.067 + tmp_file_1=/tmp/62.Tnb 00:06:12.067 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:12.067 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:12.068 + tmp_file_2=/tmp/spdk_tgt_config.json.61K 00:06:12.068 + ret=0 00:06:12.068 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:12.326 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:12.326 + diff -u /tmp/62.Tnb /tmp/spdk_tgt_config.json.61K 00:06:12.326 + ret=1 00:06:12.326 + echo '=== Start of file: /tmp/62.Tnb ===' 00:06:12.326 + cat /tmp/62.Tnb 00:06:12.326 + echo '=== End of file: /tmp/62.Tnb ===' 00:06:12.326 + echo '' 00:06:12.326 + echo '=== Start of file: /tmp/spdk_tgt_config.json.61K ===' 00:06:12.326 + cat /tmp/spdk_tgt_config.json.61K 00:06:12.326 + echo '=== End of file: /tmp/spdk_tgt_config.json.61K ===' 00:06:12.326 + echo '' 00:06:12.326 + rm /tmp/62.Tnb /tmp/spdk_tgt_config.json.61K 00:06:12.326 + exit 1 00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:06:12.326 INFO: configuration change detected. 
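The teardown that follows (`killprocess 552288`) signals the target pid and then confirms it is gone; the earlier shutdown loop in `json_config/common.sh` does the same with `kill -0` polling, allowing 30 tries 0.5 s apart. A runnable sketch with a dummy `sleep` standing in for `spdk_tgt`; SIGTERM is used here because background jobs in non-interactive shells ignore SIGINT, unlike the real `spdk_tgt` process, which the script stops with SIGINT:

```shell
sleep 30 &                       # stand-in for the spdk_tgt process
pid=$!
kill -TERM "$pid"
wait "$pid" 2>/dev/null          # reap the child so kill -0 reflects liveness
i=0
while [ "$i" -lt 30 ]; do        # poll loop, as in json_config/common.sh
    if ! kill -0 "$pid" 2>/dev/null; then
        msg='SPDK target shutdown done'
        echo "$msg"
        break
    fi
    sleep 0.5
    i=$((i + 1))
done
```

Without the explicit `wait`, the killed child would linger as a zombie and `kill -0` would keep succeeding, which is why the real script's loop also tolerates several retries.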
00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:06:12.326 11:47:02 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:12.326 11:47:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@317 -- # [[ -n 552288 ]] 00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:06:12.326 11:47:02 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:12.326 11:47:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:06:12.326 11:47:02 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:12.326 11:47:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:12.585 11:47:02 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:12.585 11:47:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:12.585 11:47:02 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:12.585 11:47:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:06:12.844 11:47:02 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:06:12.844 11:47:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:06:13.102 11:47:03 json_config -- json_config/json_config.sh@193 -- # uname -s 00:06:13.102 11:47:03 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:06:13.102 11:47:03 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:06:13.102 11:47:03 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:06:13.102 11:47:03 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:06:13.102 11:47:03 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:13.102 11:47:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:13.102 11:47:03 json_config -- json_config/json_config.sh@323 -- # killprocess 552288 00:06:13.102 11:47:03 json_config -- common/autotest_common.sh@948 -- # '[' -z 552288 ']' 00:06:13.102 11:47:03 json_config -- common/autotest_common.sh@952 -- # kill -0 552288 00:06:13.102 11:47:03 json_config -- common/autotest_common.sh@953 -- # uname 00:06:13.103 11:47:03 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:13.103 11:47:03 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 552288 00:06:13.103 11:47:03 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:13.103 11:47:03 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:13.103 11:47:03 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 552288' 00:06:13.103 killing process with pid 552288 00:06:13.103 11:47:03 json_config -- common/autotest_common.sh@967 -- # kill 552288 00:06:13.103 11:47:03 json_config -- 
common/autotest_common.sh@972 -- # wait 552288 00:06:15.007 11:47:04 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:15.007 11:47:04 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:06:15.007 11:47:04 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:15.007 11:47:04 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:15.007 11:47:04 json_config -- json_config/json_config.sh@328 -- # return 0 00:06:15.007 11:47:04 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:06:15.007 INFO: Success 00:06:15.007 00:06:15.007 real 0m24.084s 00:06:15.007 user 0m27.354s 00:06:15.007 sys 0m2.485s 00:06:15.007 11:47:04 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:15.007 11:47:04 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:15.007 ************************************ 00:06:15.007 END TEST json_config 00:06:15.007 ************************************ 00:06:15.007 11:47:04 -- common/autotest_common.sh@1142 -- # return 0 00:06:15.008 11:47:04 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:15.008 11:47:04 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:15.008 11:47:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.008 11:47:04 -- common/autotest_common.sh@10 -- # set +x 00:06:15.008 ************************************ 00:06:15.008 START TEST json_config_extra_key 00:06:15.008 ************************************ 00:06:15.008 11:47:04 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:15.008 11:47:05 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:15.008 11:47:05 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:15.008 11:47:05 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:15.008 11:47:05 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:15.008 11:47:05 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:15.008 11:47:05 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:15.008 11:47:05 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:15.008 11:47:05 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:15.008 11:47:05 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:15.008 11:47:05 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:15.008 INFO: launching applications... 00:06:15.008 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=554000 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:15.008 Waiting for target to run... 
00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 554000 /var/tmp/spdk_tgt.sock 00:06:15.008 11:47:05 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 554000 ']' 00:06:15.008 11:47:05 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:15.008 11:47:05 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:15.008 11:47:05 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:15.008 11:47:05 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:15.008 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:15.008 11:47:05 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:15.008 11:47:05 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:15.008 [2024-07-12 11:47:05.106178] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:06:15.008 [2024-07-12 11:47:05.106225] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid554000 ] 00:06:15.581 [2024-07-12 11:47:05.558609] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.581 [2024-07-12 11:47:05.645453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.840 11:47:05 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:15.840 11:47:05 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:06:15.840 11:47:05 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:15.840 00:06:15.840 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:15.840 INFO: shutting down applications... 00:06:15.840 11:47:05 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:15.840 11:47:05 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:15.840 11:47:05 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:15.840 11:47:05 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 554000 ]] 00:06:15.840 11:47:05 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 554000 00:06:15.840 11:47:05 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:15.840 11:47:05 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:15.840 11:47:05 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 554000 00:06:15.840 11:47:05 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:16.407 11:47:06 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:16.407 11:47:06 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 
00:06:16.407 11:47:06 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 554000 00:06:16.407 11:47:06 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:16.407 11:47:06 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:16.407 11:47:06 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:16.407 11:47:06 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:16.407 SPDK target shutdown done 00:06:16.407 11:47:06 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:16.407 Success 00:06:16.407 00:06:16.407 real 0m1.458s 00:06:16.407 user 0m0.933s 00:06:16.407 sys 0m0.538s 00:06:16.407 11:47:06 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:16.407 11:47:06 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:16.407 ************************************ 00:06:16.407 END TEST json_config_extra_key 00:06:16.407 ************************************ 00:06:16.407 11:47:06 -- common/autotest_common.sh@1142 -- # return 0 00:06:16.407 11:47:06 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:16.407 11:47:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:16.407 11:47:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.407 11:47:06 -- common/autotest_common.sh@10 -- # set +x 00:06:16.407 ************************************ 00:06:16.407 START TEST alias_rpc 00:06:16.407 ************************************ 00:06:16.407 11:47:06 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:16.407 * Looking for test storage... 
00:06:16.407 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:06:16.407 11:47:06 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:16.407 11:47:06 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=554278 00:06:16.407 11:47:06 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 554278 00:06:16.407 11:47:06 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:16.407 11:47:06 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 554278 ']' 00:06:16.407 11:47:06 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.407 11:47:06 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:16.407 11:47:06 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:16.407 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:16.407 11:47:06 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:16.407 11:47:06 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:16.407 [2024-07-12 11:47:06.628145] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:06:16.407 [2024-07-12 11:47:06.628194] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid554278 ] 00:06:16.666 [2024-07-12 11:47:06.705295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.666 [2024-07-12 11:47:06.781302] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.261 11:47:07 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:17.261 11:47:07 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:06:17.261 11:47:07 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:17.519 11:47:07 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 554278 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 554278 ']' 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 554278 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 554278 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 554278' 00:06:17.519 killing process with pid 554278 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@967 -- # kill 554278 00:06:17.519 11:47:07 alias_rpc -- common/autotest_common.sh@972 -- # wait 554278 00:06:17.778 00:06:17.778 real 0m1.505s 00:06:17.778 user 0m1.619s 00:06:17.778 sys 0m0.423s 00:06:17.778 11:47:07 alias_rpc -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:06:17.778 11:47:07 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:17.778 ************************************ 00:06:17.778 END TEST alias_rpc 00:06:17.778 ************************************ 00:06:18.036 11:47:08 -- common/autotest_common.sh@1142 -- # return 0 00:06:18.036 11:47:08 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:06:18.036 11:47:08 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:18.036 11:47:08 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:18.036 11:47:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:18.036 11:47:08 -- common/autotest_common.sh@10 -- # set +x 00:06:18.036 ************************************ 00:06:18.036 START TEST spdkcli_tcp 00:06:18.036 ************************************ 00:06:18.036 11:47:08 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:18.036 * Looking for test storage... 
00:06:18.036 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:18.036 11:47:08 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:18.036 11:47:08 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=554566 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 554566 00:06:18.036 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:18.036 11:47:08 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 554566 ']' 00:06:18.036 11:47:08 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:18.036 11:47:08 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:18.036 11:47:08 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:18.036 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:18.036 11:47:08 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:18.036 11:47:08 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:18.036 [2024-07-12 11:47:08.210682] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:18.036 [2024-07-12 11:47:08.210727] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid554566 ] 00:06:18.294 [2024-07-12 11:47:08.284898] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:18.294 [2024-07-12 11:47:08.356890] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:18.294 [2024-07-12 11:47:08.356904] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.860 11:47:08 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:18.860 11:47:08 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:06:18.860 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=554790 00:06:18.860 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:18.860 11:47:08 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:19.119 [ 00:06:19.119 "bdev_malloc_delete", 00:06:19.119 "bdev_malloc_create", 00:06:19.119 "bdev_null_resize", 00:06:19.119 "bdev_null_delete", 00:06:19.119 "bdev_null_create", 00:06:19.119 "bdev_nvme_cuse_unregister", 00:06:19.119 "bdev_nvme_cuse_register", 00:06:19.119 "bdev_opal_new_user", 00:06:19.119 "bdev_opal_set_lock_state", 00:06:19.119 "bdev_opal_delete", 00:06:19.119 "bdev_opal_get_info", 00:06:19.119 "bdev_opal_create", 00:06:19.119 "bdev_nvme_opal_revert", 00:06:19.119 "bdev_nvme_opal_init", 00:06:19.119 "bdev_nvme_send_cmd", 00:06:19.119 "bdev_nvme_get_path_iostat", 
00:06:19.119 "bdev_nvme_get_mdns_discovery_info", 00:06:19.119 "bdev_nvme_stop_mdns_discovery", 00:06:19.119 "bdev_nvme_start_mdns_discovery", 00:06:19.119 "bdev_nvme_set_multipath_policy", 00:06:19.119 "bdev_nvme_set_preferred_path", 00:06:19.119 "bdev_nvme_get_io_paths", 00:06:19.119 "bdev_nvme_remove_error_injection", 00:06:19.119 "bdev_nvme_add_error_injection", 00:06:19.119 "bdev_nvme_get_discovery_info", 00:06:19.119 "bdev_nvme_stop_discovery", 00:06:19.119 "bdev_nvme_start_discovery", 00:06:19.119 "bdev_nvme_get_controller_health_info", 00:06:19.119 "bdev_nvme_disable_controller", 00:06:19.119 "bdev_nvme_enable_controller", 00:06:19.119 "bdev_nvme_reset_controller", 00:06:19.119 "bdev_nvme_get_transport_statistics", 00:06:19.119 "bdev_nvme_apply_firmware", 00:06:19.119 "bdev_nvme_detach_controller", 00:06:19.119 "bdev_nvme_get_controllers", 00:06:19.119 "bdev_nvme_attach_controller", 00:06:19.119 "bdev_nvme_set_hotplug", 00:06:19.119 "bdev_nvme_set_options", 00:06:19.119 "bdev_passthru_delete", 00:06:19.119 "bdev_passthru_create", 00:06:19.119 "bdev_lvol_set_parent_bdev", 00:06:19.119 "bdev_lvol_set_parent", 00:06:19.119 "bdev_lvol_check_shallow_copy", 00:06:19.119 "bdev_lvol_start_shallow_copy", 00:06:19.119 "bdev_lvol_grow_lvstore", 00:06:19.119 "bdev_lvol_get_lvols", 00:06:19.119 "bdev_lvol_get_lvstores", 00:06:19.119 "bdev_lvol_delete", 00:06:19.119 "bdev_lvol_set_read_only", 00:06:19.119 "bdev_lvol_resize", 00:06:19.119 "bdev_lvol_decouple_parent", 00:06:19.119 "bdev_lvol_inflate", 00:06:19.119 "bdev_lvol_rename", 00:06:19.119 "bdev_lvol_clone_bdev", 00:06:19.119 "bdev_lvol_clone", 00:06:19.119 "bdev_lvol_snapshot", 00:06:19.119 "bdev_lvol_create", 00:06:19.119 "bdev_lvol_delete_lvstore", 00:06:19.119 "bdev_lvol_rename_lvstore", 00:06:19.120 "bdev_lvol_create_lvstore", 00:06:19.120 "bdev_raid_set_options", 00:06:19.120 "bdev_raid_remove_base_bdev", 00:06:19.120 "bdev_raid_add_base_bdev", 00:06:19.120 "bdev_raid_delete", 00:06:19.120 "bdev_raid_create", 
00:06:19.120 "bdev_raid_get_bdevs", 00:06:19.120 "bdev_error_inject_error", 00:06:19.120 "bdev_error_delete", 00:06:19.120 "bdev_error_create", 00:06:19.120 "bdev_split_delete", 00:06:19.120 "bdev_split_create", 00:06:19.120 "bdev_delay_delete", 00:06:19.120 "bdev_delay_create", 00:06:19.120 "bdev_delay_update_latency", 00:06:19.120 "bdev_zone_block_delete", 00:06:19.120 "bdev_zone_block_create", 00:06:19.120 "blobfs_create", 00:06:19.120 "blobfs_detect", 00:06:19.120 "blobfs_set_cache_size", 00:06:19.120 "bdev_crypto_delete", 00:06:19.120 "bdev_crypto_create", 00:06:19.120 "bdev_compress_delete", 00:06:19.120 "bdev_compress_create", 00:06:19.120 "bdev_compress_get_orphans", 00:06:19.120 "bdev_aio_delete", 00:06:19.120 "bdev_aio_rescan", 00:06:19.120 "bdev_aio_create", 00:06:19.120 "bdev_ftl_set_property", 00:06:19.120 "bdev_ftl_get_properties", 00:06:19.120 "bdev_ftl_get_stats", 00:06:19.120 "bdev_ftl_unmap", 00:06:19.120 "bdev_ftl_unload", 00:06:19.120 "bdev_ftl_delete", 00:06:19.120 "bdev_ftl_load", 00:06:19.120 "bdev_ftl_create", 00:06:19.120 "bdev_virtio_attach_controller", 00:06:19.120 "bdev_virtio_scsi_get_devices", 00:06:19.120 "bdev_virtio_detach_controller", 00:06:19.120 "bdev_virtio_blk_set_hotplug", 00:06:19.120 "bdev_iscsi_delete", 00:06:19.120 "bdev_iscsi_create", 00:06:19.120 "bdev_iscsi_set_options", 00:06:19.120 "accel_error_inject_error", 00:06:19.120 "ioat_scan_accel_module", 00:06:19.120 "dsa_scan_accel_module", 00:06:19.120 "iaa_scan_accel_module", 00:06:19.120 "dpdk_cryptodev_get_driver", 00:06:19.120 "dpdk_cryptodev_set_driver", 00:06:19.120 "dpdk_cryptodev_scan_accel_module", 00:06:19.120 "compressdev_scan_accel_module", 00:06:19.120 "keyring_file_remove_key", 00:06:19.120 "keyring_file_add_key", 00:06:19.120 "keyring_linux_set_options", 00:06:19.120 "iscsi_get_histogram", 00:06:19.120 "iscsi_enable_histogram", 00:06:19.120 "iscsi_set_options", 00:06:19.120 "iscsi_get_auth_groups", 00:06:19.120 "iscsi_auth_group_remove_secret", 00:06:19.120 
"iscsi_auth_group_add_secret", 00:06:19.120 "iscsi_delete_auth_group", 00:06:19.120 "iscsi_create_auth_group", 00:06:19.120 "iscsi_set_discovery_auth", 00:06:19.120 "iscsi_get_options", 00:06:19.120 "iscsi_target_node_request_logout", 00:06:19.120 "iscsi_target_node_set_redirect", 00:06:19.120 "iscsi_target_node_set_auth", 00:06:19.120 "iscsi_target_node_add_lun", 00:06:19.120 "iscsi_get_stats", 00:06:19.120 "iscsi_get_connections", 00:06:19.120 "iscsi_portal_group_set_auth", 00:06:19.120 "iscsi_start_portal_group", 00:06:19.120 "iscsi_delete_portal_group", 00:06:19.120 "iscsi_create_portal_group", 00:06:19.120 "iscsi_get_portal_groups", 00:06:19.120 "iscsi_delete_target_node", 00:06:19.120 "iscsi_target_node_remove_pg_ig_maps", 00:06:19.120 "iscsi_target_node_add_pg_ig_maps", 00:06:19.120 "iscsi_create_target_node", 00:06:19.120 "iscsi_get_target_nodes", 00:06:19.120 "iscsi_delete_initiator_group", 00:06:19.120 "iscsi_initiator_group_remove_initiators", 00:06:19.120 "iscsi_initiator_group_add_initiators", 00:06:19.120 "iscsi_create_initiator_group", 00:06:19.120 "iscsi_get_initiator_groups", 00:06:19.120 "nvmf_set_crdt", 00:06:19.120 "nvmf_set_config", 00:06:19.120 "nvmf_set_max_subsystems", 00:06:19.120 "nvmf_stop_mdns_prr", 00:06:19.120 "nvmf_publish_mdns_prr", 00:06:19.120 "nvmf_subsystem_get_listeners", 00:06:19.120 "nvmf_subsystem_get_qpairs", 00:06:19.120 "nvmf_subsystem_get_controllers", 00:06:19.120 "nvmf_get_stats", 00:06:19.120 "nvmf_get_transports", 00:06:19.120 "nvmf_create_transport", 00:06:19.120 "nvmf_get_targets", 00:06:19.120 "nvmf_delete_target", 00:06:19.120 "nvmf_create_target", 00:06:19.120 "nvmf_subsystem_allow_any_host", 00:06:19.120 "nvmf_subsystem_remove_host", 00:06:19.120 "nvmf_subsystem_add_host", 00:06:19.120 "nvmf_ns_remove_host", 00:06:19.120 "nvmf_ns_add_host", 00:06:19.120 "nvmf_subsystem_remove_ns", 00:06:19.120 "nvmf_subsystem_add_ns", 00:06:19.120 "nvmf_subsystem_listener_set_ana_state", 00:06:19.120 
"nvmf_discovery_get_referrals", 00:06:19.120 "nvmf_discovery_remove_referral", 00:06:19.120 "nvmf_discovery_add_referral", 00:06:19.120 "nvmf_subsystem_remove_listener", 00:06:19.120 "nvmf_subsystem_add_listener", 00:06:19.120 "nvmf_delete_subsystem", 00:06:19.120 "nvmf_create_subsystem", 00:06:19.120 "nvmf_get_subsystems", 00:06:19.120 "env_dpdk_get_mem_stats", 00:06:19.120 "nbd_get_disks", 00:06:19.120 "nbd_stop_disk", 00:06:19.120 "nbd_start_disk", 00:06:19.120 "ublk_recover_disk", 00:06:19.120 "ublk_get_disks", 00:06:19.120 "ublk_stop_disk", 00:06:19.120 "ublk_start_disk", 00:06:19.120 "ublk_destroy_target", 00:06:19.120 "ublk_create_target", 00:06:19.120 "virtio_blk_create_transport", 00:06:19.120 "virtio_blk_get_transports", 00:06:19.120 "vhost_controller_set_coalescing", 00:06:19.120 "vhost_get_controllers", 00:06:19.120 "vhost_delete_controller", 00:06:19.120 "vhost_create_blk_controller", 00:06:19.120 "vhost_scsi_controller_remove_target", 00:06:19.120 "vhost_scsi_controller_add_target", 00:06:19.120 "vhost_start_scsi_controller", 00:06:19.120 "vhost_create_scsi_controller", 00:06:19.120 "thread_set_cpumask", 00:06:19.120 "framework_get_governor", 00:06:19.120 "framework_get_scheduler", 00:06:19.120 "framework_set_scheduler", 00:06:19.120 "framework_get_reactors", 00:06:19.120 "thread_get_io_channels", 00:06:19.120 "thread_get_pollers", 00:06:19.120 "thread_get_stats", 00:06:19.120 "framework_monitor_context_switch", 00:06:19.120 "spdk_kill_instance", 00:06:19.120 "log_enable_timestamps", 00:06:19.120 "log_get_flags", 00:06:19.120 "log_clear_flag", 00:06:19.120 "log_set_flag", 00:06:19.120 "log_get_level", 00:06:19.120 "log_set_level", 00:06:19.120 "log_get_print_level", 00:06:19.120 "log_set_print_level", 00:06:19.120 "framework_enable_cpumask_locks", 00:06:19.120 "framework_disable_cpumask_locks", 00:06:19.120 "framework_wait_init", 00:06:19.120 "framework_start_init", 00:06:19.120 "scsi_get_devices", 00:06:19.120 "bdev_get_histogram", 00:06:19.120 
"bdev_enable_histogram", 00:06:19.120 "bdev_set_qos_limit", 00:06:19.120 "bdev_set_qd_sampling_period", 00:06:19.120 "bdev_get_bdevs", 00:06:19.120 "bdev_reset_iostat", 00:06:19.120 "bdev_get_iostat", 00:06:19.120 "bdev_examine", 00:06:19.120 "bdev_wait_for_examine", 00:06:19.120 "bdev_set_options", 00:06:19.120 "notify_get_notifications", 00:06:19.120 "notify_get_types", 00:06:19.120 "accel_get_stats", 00:06:19.120 "accel_set_options", 00:06:19.120 "accel_set_driver", 00:06:19.120 "accel_crypto_key_destroy", 00:06:19.120 "accel_crypto_keys_get", 00:06:19.120 "accel_crypto_key_create", 00:06:19.120 "accel_assign_opc", 00:06:19.120 "accel_get_module_info", 00:06:19.120 "accel_get_opc_assignments", 00:06:19.120 "vmd_rescan", 00:06:19.120 "vmd_remove_device", 00:06:19.120 "vmd_enable", 00:06:19.120 "sock_get_default_impl", 00:06:19.120 "sock_set_default_impl", 00:06:19.120 "sock_impl_set_options", 00:06:19.120 "sock_impl_get_options", 00:06:19.120 "iobuf_get_stats", 00:06:19.120 "iobuf_set_options", 00:06:19.120 "framework_get_pci_devices", 00:06:19.120 "framework_get_config", 00:06:19.120 "framework_get_subsystems", 00:06:19.120 "trace_get_info", 00:06:19.120 "trace_get_tpoint_group_mask", 00:06:19.120 "trace_disable_tpoint_group", 00:06:19.120 "trace_enable_tpoint_group", 00:06:19.120 "trace_clear_tpoint_mask", 00:06:19.120 "trace_set_tpoint_mask", 00:06:19.120 "keyring_get_keys", 00:06:19.120 "spdk_get_version", 00:06:19.120 "rpc_get_methods" 00:06:19.120 ] 00:06:19.120 11:47:09 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:19.120 11:47:09 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:19.120 11:47:09 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 554566 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 554566 ']' 
00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 554566 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 554566 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 554566' 00:06:19.120 killing process with pid 554566 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 554566 00:06:19.120 11:47:09 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 554566 00:06:19.380 00:06:19.380 real 0m1.506s 00:06:19.380 user 0m2.750s 00:06:19.380 sys 0m0.445s 00:06:19.380 11:47:09 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.380 11:47:09 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:19.380 ************************************ 00:06:19.380 END TEST spdkcli_tcp 00:06:19.380 ************************************ 00:06:19.380 11:47:09 -- common/autotest_common.sh@1142 -- # return 0 00:06:19.380 11:47:09 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:19.380 11:47:09 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:19.380 11:47:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.380 11:47:09 -- common/autotest_common.sh@10 -- # set +x 00:06:19.639 ************************************ 00:06:19.639 START TEST dpdk_mem_utility 00:06:19.639 ************************************ 00:06:19.639 11:47:09 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:19.639 * Looking for test storage... 00:06:19.639 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:06:19.639 11:47:09 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:19.639 11:47:09 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=554861 00:06:19.639 11:47:09 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 554861 00:06:19.639 11:47:09 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:19.639 11:47:09 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 554861 ']' 00:06:19.639 11:47:09 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.639 11:47:09 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:19.639 11:47:09 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.639 11:47:09 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:19.639 11:47:09 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:19.639 [2024-07-12 11:47:09.778578] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:06:19.639 [2024-07-12 11:47:09.778619] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid554861 ] 00:06:19.639 [2024-07-12 11:47:09.856040] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.899 [2024-07-12 11:47:09.933358] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.471 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:20.471 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:06:20.471 11:47:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:20.471 11:47:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:20.471 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.471 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:20.471 { 00:06:20.471 "filename": "/tmp/spdk_mem_dump.txt" 00:06:20.471 } 00:06:20.471 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.471 11:47:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:20.471 DPDK memory size 814.000000 MiB in 1 heap(s) 00:06:20.471 1 heaps totaling size 814.000000 MiB 00:06:20.471 size: 814.000000 MiB heap id: 0 00:06:20.471 end heaps---------- 00:06:20.471 8 mempools totaling size 598.116089 MiB 00:06:20.471 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:20.471 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:20.471 size: 84.521057 MiB name: bdev_io_554861 00:06:20.471 size: 51.011292 MiB name: evtpool_554861 00:06:20.471 size: 50.003479 MiB name: msgpool_554861 00:06:20.471 size: 21.763794 MiB name: 
PDU_Pool 00:06:20.471 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:20.471 size: 0.026123 MiB name: Session_Pool 00:06:20.471 end mempools------- 00:06:20.471 201 memzones totaling size 4.176453 MiB 00:06:20.471 size: 1.000366 MiB name: RG_ring_0_554861 00:06:20.471 size: 1.000366 MiB name: RG_ring_1_554861 00:06:20.471 size: 1.000366 MiB name: RG_ring_4_554861 00:06:20.471 size: 1.000366 MiB name: RG_ring_5_554861 00:06:20.471 size: 0.125366 MiB name: RG_ring_2_554861 00:06:20.471 size: 0.015991 MiB name: RG_ring_3_554861 00:06:20.471 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:20.471 size: 0.000305 MiB name: 0000:1a:01.0_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:01.1_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:01.2_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:01.3_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:01.4_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:01.5_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:01.6_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:01.7_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:02.0_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:02.1_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:02.2_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:02.3_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:02.4_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:02.5_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:02.6_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1a:02.7_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:01.0_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:01.1_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:01.2_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:01.3_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:01.4_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:01.5_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:01.6_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:01.7_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:02.0_qat 00:06:20.471 size: 0.000305 MiB name: 
0000:1c:02.1_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:02.2_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:02.3_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:02.4_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:02.5_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:02.6_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1c:02.7_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:01.0_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:01.1_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:01.2_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:01.3_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:01.4_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:01.5_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:01.6_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:01.7_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:02.0_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:02.1_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:02.2_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:02.3_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:02.4_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:02.5_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:02.6_qat 00:06:20.471 size: 0.000305 MiB name: 0000:1e:02.7_qat 00:06:20.471 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:20.471 size: 0.000122 MiB name: 
rte_compressdev_data_3 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:20.471 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:20.471 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:20.472 size: 0.000122 MiB name: 
rte_compressdev_data_14 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_20 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:20.472 size: 0.000122 MiB 
name: rte_compressdev_data_25 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:20.472 size: 0.000122 
MiB name: rte_compressdev_data_36 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:20.472 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:20.472 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:20.472 size: 
0.000122 MiB name: rte_compressdev_data_47 00:06:20.472 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:20.472 end memzones------- 00:06:20.472 11:47:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:20.472 heap id: 0 total size: 814.000000 MiB number of busy elements: 637 number of free elements: 14 00:06:20.472 list of free elements. size: 11.781372 MiB 00:06:20.472 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:20.472 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:20.472 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:20.472 element at address: 0x200003e00000 with size: 0.996460 MiB 00:06:20.472 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:20.472 element at address: 0x200013800000 with size: 0.978699 MiB 00:06:20.472 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:20.472 element at address: 0x200019200000 with size: 0.936584 MiB 00:06:20.472 element at address: 0x20001aa00000 with size: 0.564026 MiB 00:06:20.472 element at address: 0x200003a00000 with size: 0.495056 MiB 00:06:20.472 element at address: 0x20000b200000 with size: 0.488892 MiB 00:06:20.472 element at address: 0x200000800000 with size: 0.486694 MiB 00:06:20.472 element at address: 0x200019400000 with size: 0.485657 MiB 00:06:20.472 element at address: 0x200027e00000 with size: 0.395752 MiB 00:06:20.472 list of standard malloc elements. 
size: 199.898621 MiB 00:06:20.472 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:20.472 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:20.472 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:20.472 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:20.472 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:20.472 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:20.472 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:20.472 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:20.472 element at address: 0x20000032bc80 with size: 0.004395 MiB 00:06:20.472 element at address: 0x20000032f740 with size: 0.004395 MiB 00:06:20.472 element at address: 0x200000333200 with size: 0.004395 MiB 00:06:20.472 element at address: 0x200000336cc0 with size: 0.004395 MiB 00:06:20.472 element at address: 0x20000033a780 with size: 0.004395 MiB 00:06:20.472 element at address: 0x20000033e240 with size: 0.004395 MiB 00:06:20.472 element at address: 0x200000341d00 with size: 0.004395 MiB 00:06:20.472 element at address: 0x2000003457c0 with size: 0.004395 MiB 00:06:20.472 element at address: 0x200000349280 with size: 0.004395 MiB 00:06:20.472 element at address: 0x20000034cd40 with size: 0.004395 MiB 00:06:20.472 element at address: 0x200000350800 with size: 0.004395 MiB 00:06:20.472 element at address: 0x2000003542c0 with size: 0.004395 MiB 00:06:20.472 element at address: 0x200000357d80 with size: 0.004395 MiB 00:06:20.472 element at address: 0x20000035b840 with size: 0.004395 MiB 00:06:20.472 element at address: 0x20000035f300 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000366880 with size: 0.004395 MiB 00:06:20.473 element at address: 0x20000036a340 with size: 0.004395 MiB 00:06:20.473 element at address: 0x20000036de00 with size: 0.004395 MiB 00:06:20.473 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000375380 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000378e40 with size: 0.004395 MiB 00:06:20.473 element at address: 0x20000037c900 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000383e80 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000387940 with size: 0.004395 MiB 00:06:20.473 element at address: 0x20000038b400 with size: 0.004395 MiB 00:06:20.473 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000392980 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000396440 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000399f00 with size: 0.004395 MiB 00:06:20.473 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:06:20.473 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:06:20.473 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:20.473 element at address: 0x200000329b80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000032ac00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000032d640 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000032e6c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000331100 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000332180 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000334bc0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000335c40 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000338680 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000339700 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000033c140 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000033d1c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000033fc00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000340c80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003436c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000344740 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000347180 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000348200 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000034ac40 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000034bcc0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000034e700 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000034f780 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003521c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000353240 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000355c80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000356d00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000359740 with 
size: 0.004028 MiB 00:06:20.473 element at address: 0x20000035a7c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000035d200 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000035e280 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000361d40 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000364780 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000365800 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000368240 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000370840 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000373280 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000374300 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000376d40 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000037a800 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000037b880 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000037f340 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000381d80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000382e00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000385840 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000389300 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000038a380 with size: 0.004028 MiB 00:06:20.473 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000038de40 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000390880 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000391900 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000394340 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000397e00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000398e80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000039c940 with size: 0.004028 MiB 00:06:20.473 element at address: 0x20000039f380 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:06:20.473 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:20.473 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:20.473 element at address: 0x200000200000 with size: 0.000305 MiB 00:06:20.473 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:06:20.473 element at address: 0x200000200140 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200200 with size: 0.000183 MiB 00:06:20.473 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200380 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200440 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200500 with size: 0.000183 MiB 00:06:20.473 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200680 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200740 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200800 with size: 0.000183 MiB 00:06:20.473 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200980 with size: 0.000183 
MiB 00:06:20.473 element at address: 0x200000200a40 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200b00 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200c80 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200d40 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200e00 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200ec0 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000200f80 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000201040 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000201100 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000201300 with size: 0.000183 MiB 00:06:20.473 element at address: 0x2000002055c0 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000225880 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000225940 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000225a00 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000225ac0 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000225b80 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000225c40 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000225d00 with size: 0.000183 MiB 00:06:20.473 element at address: 0x200000225dc0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000225e80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000225f40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226000 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000002260c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226180 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226240 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226300 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000002263c0 
with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226480 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226680 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226740 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226800 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000002268c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226980 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226a40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226b00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226bc0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226c80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226d40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226e00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226ec0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000226f80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000227040 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000227100 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000329300 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003293c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000329580 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000329640 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000329800 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000032ce80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000032d040 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000032d100 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000032d2c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000330940 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000330b00 with size: 0.000183 MiB 00:06:20.474 element at 
address: 0x200000330bc0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000330d80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000334400 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003345c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000334680 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000334840 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000337ec0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000338080 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000338140 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000338300 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000033b980 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000033bb40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000033bc00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000033f440 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000033f600 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000033f6c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000033f880 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000342f00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003430c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000343180 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000343340 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003469c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000346b80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000346c40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000346e00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000034a480 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000034a640 with size: 0.000183 MiB 
00:06:20.474 element at address: 0x20000034a700 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000034a8c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000034df40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000034e100 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000034e1c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000034e380 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000351a00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000351bc0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000351c80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000351e40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003554c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000355680 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000355740 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000355900 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000358f80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000359140 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000359200 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003593c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000035ca40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000035cc00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000035ccc0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000035ce80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000360500 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003606c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000360780 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000360940 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000364180 with 
size: 0.000183 MiB 00:06:20.474 element at address: 0x200000364240 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000364400 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000367a80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000367c40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000367d00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000036b540 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000036b700 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000036b980 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000036f000 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000036f280 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000036f440 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000372c80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000372d40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000372f00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000376580 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000376740 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000376800 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000037a040 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000037a200 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000037a480 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000037db00 with size: 0.000183 MiB 00:06:20.474 element at address: 
0x20000037dcc0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000037df40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000381780 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000381840 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000381a00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000385080 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000385240 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000385300 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000388b40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000388d00 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000388f80 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000038c600 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000038c880 with size: 0.000183 MiB 00:06:20.474 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:06:20.474 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000390280 with size: 0.000183 MiB 00:06:20.474 element at address: 0x200000390340 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200000390500 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200000393b80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200000393d40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200000393e00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200000397640 with size: 0.000183 MiB 00:06:20.475 
element at address: 0x200000397800 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200000397a80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000039b100 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000039b380 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000039b540 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000039f000 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b1180 with size: 0.000183 
MiB 00:06:20.475 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003cacc0 
with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000087c980 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7ebc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7ec80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7ed40 with size: 0.000183 MiB 00:06:20.475 element at 
address: 0x200003a7ee00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7eec0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7ef80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f040 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f100 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f1c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f280 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f340 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f400 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f4c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f580 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f640 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f700 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f7c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f880 with size: 0.000183 MiB 00:06:20.475 element at address: 0x200003a7f940 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d280 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d340 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d400 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d4c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d580 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d640 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d700 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d7c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d880 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27d940 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b27da00 with size: 0.000183 MiB 
00:06:20.475 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90640 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90700 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa907c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90880 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90940 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90a00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90ac0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90b80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90c40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90d00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90dc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90e80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa90f40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91000 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa910c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91180 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91240 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91300 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa913c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91480 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91540 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91600 with 
size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa916c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91780 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91840 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91900 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa919c0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91a80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91b40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91c00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91cc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91d80 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91e40 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91f00 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa91fc0 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa92080 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa92140 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa92200 with size: 0.000183 MiB 00:06:20.475 element at address: 0x20001aa922c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92380 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92440 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92500 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa925c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92680 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92740 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92800 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa928c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92980 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92a40 with size: 0.000183 MiB 00:06:20.476 element at address: 
0x20001aa92b00 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92bc0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92c80 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92d40 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92e00 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92ec0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa92f80 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93040 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93100 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa931c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93280 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93340 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93400 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa934c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93580 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93640 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93700 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa937c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93880 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93940 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93a00 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93ac0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93b80 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93c40 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93d00 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93dc0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93e80 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa93f40 with size: 0.000183 MiB 00:06:20.476 
element at address: 0x20001aa94000 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa940c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94180 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94240 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94300 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa943c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94480 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94540 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94600 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa946c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94780 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:20.476 element at address: 0x20001aa95440 with size: 0.000183 
MiB 00:06:20.476 element at address: 0x200027e65500 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e655c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c1c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c3c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c480 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c540 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c600 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c6c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c780 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c840 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c900 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6c9c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6ca80 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6cb40 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6cc00 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6ccc0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6cd80 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6ce40 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6cf00 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6cfc0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d080 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d140 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d200 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d5c0 
with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:06:20.476 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:06:20.477 element at 
address: 0x200027e6eac0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:20.477 element at address: 0x200027e6ff00 with size: 0.000183 MiB 
00:06:20.477 list of memzone associated elements. size: 602.320007 MiB 00:06:20.477 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:20.477 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:20.477 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:20.477 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:20.477 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:20.477 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_554861_0 00:06:20.477 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:20.477 associated memzone info: size: 48.002930 MiB name: MP_evtpool_554861_0 00:06:20.477 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:20.477 associated memzone info: size: 48.002930 MiB name: MP_msgpool_554861_0 00:06:20.477 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:20.477 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:20.477 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:20.477 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:20.477 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:20.477 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_554861 00:06:20.477 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:20.477 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_554861 00:06:20.477 element at address: 0x2000002271c0 with size: 1.008118 MiB 00:06:20.477 associated memzone info: size: 1.007996 MiB name: MP_evtpool_554861 00:06:20.477 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:20.477 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:20.477 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:20.477 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:20.477 element at address: 
0x2000070fde40 with size: 1.008118 MiB 00:06:20.477 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:20.477 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:20.477 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:20.477 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:20.477 associated memzone info: size: 1.000366 MiB name: RG_ring_0_554861 00:06:20.477 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:20.477 associated memzone info: size: 1.000366 MiB name: RG_ring_1_554861 00:06:20.477 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:20.477 associated memzone info: size: 1.000366 MiB name: RG_ring_4_554861 00:06:20.477 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:20.477 associated memzone info: size: 1.000366 MiB name: RG_ring_5_554861 00:06:20.477 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:06:20.477 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_554861 00:06:20.477 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:06:20.477 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:20.477 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:20.477 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:20.477 element at address: 0x20001947c540 with size: 0.250488 MiB 00:06:20.477 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:20.477 element at address: 0x200000205680 with size: 0.125488 MiB 00:06:20.477 associated memzone info: size: 0.125366 MiB name: RG_ring_2_554861 00:06:20.477 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:20.477 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:20.477 element at address: 0x200027e65680 with size: 0.023743 MiB 00:06:20.477 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:20.477 element at 
address: 0x2000002013c0 with size: 0.016113 MiB 00:06:20.477 associated memzone info: size: 0.015991 MiB name: RG_ring_3_554861 00:06:20.477 element at address: 0x200027e6b7c0 with size: 0.002441 MiB 00:06:20.477 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:20.477 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:06:20.477 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:20.477 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.0_qat 00:06:20.477 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.1_qat 00:06:20.477 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.2_qat 00:06:20.477 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.3_qat 00:06:20.477 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.4_qat 00:06:20.477 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.5_qat 00:06:20.477 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.6_qat 00:06:20.477 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.7_qat 00:06:20.477 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.0_qat 00:06:20.477 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.1_qat 00:06:20.477 element at address: 
0x2000003b1780 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.2_qat 00:06:20.477 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.3_qat 00:06:20.477 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.4_qat 00:06:20.477 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.5_qat 00:06:20.477 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.6_qat 00:06:20.477 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.7_qat 00:06:20.477 element at address: 0x20000039b700 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.0_qat 00:06:20.477 element at address: 0x200000397c40 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.1_qat 00:06:20.477 element at address: 0x200000394180 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.2_qat 00:06:20.477 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.3_qat 00:06:20.477 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.4_qat 00:06:20.477 element at address: 0x200000389140 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.5_qat 00:06:20.477 element at address: 0x200000385680 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.6_qat 00:06:20.477 element at address: 0x200000381bc0 with 
size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.7_qat 00:06:20.477 element at address: 0x20000037e100 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.0_qat 00:06:20.477 element at address: 0x20000037a640 with size: 0.000427 MiB 00:06:20.477 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.1_qat 00:06:20.478 element at address: 0x200000376b80 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.2_qat 00:06:20.478 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.3_qat 00:06:20.478 element at address: 0x20000036f600 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.4_qat 00:06:20.478 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.5_qat 00:06:20.478 element at address: 0x200000368080 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.6_qat 00:06:20.478 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.7_qat 00:06:20.478 element at address: 0x200000360b00 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.0_qat 00:06:20.478 element at address: 0x20000035d040 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.1_qat 00:06:20.478 element at address: 0x200000359580 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.2_qat 00:06:20.478 element at address: 0x200000355ac0 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.3_qat 00:06:20.478 element at address: 0x200000352000 with size: 0.000427 MiB 
00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.4_qat 00:06:20.478 element at address: 0x20000034e540 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.5_qat 00:06:20.478 element at address: 0x20000034aa80 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.6_qat 00:06:20.478 element at address: 0x200000346fc0 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.7_qat 00:06:20.478 element at address: 0x200000343500 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.0_qat 00:06:20.478 element at address: 0x20000033fa40 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.1_qat 00:06:20.478 element at address: 0x20000033bf80 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.2_qat 00:06:20.478 element at address: 0x2000003384c0 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.3_qat 00:06:20.478 element at address: 0x200000334a00 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.4_qat 00:06:20.478 element at address: 0x200000330f40 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.5_qat 00:06:20.478 element at address: 0x20000032d480 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.6_qat 00:06:20.478 element at address: 0x2000003299c0 with size: 0.000427 MiB 00:06:20.478 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.7_qat 00:06:20.478 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:06:20.478 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:20.478 element at address: 0x200000226540 with size: 0.000305 MiB 00:06:20.478 
associated memzone info: size: 0.000183 MiB name: MP_msgpool_554861 00:06:20.478 element at address: 0x2000002011c0 with size: 0.000305 MiB 00:06:20.478 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_554861 00:06:20.478 element at address: 0x200027e6c280 with size: 0.000305 MiB 00:06:20.478 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:20.478 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:20.478 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:20.478 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:06:20.478 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:20.478 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:20.478 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:06:20.478 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:20.478 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:20.478 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:06:20.478 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:20.478 element at address: 0x2000003cb000 with size: 
0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:20.478 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:06:20.478 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:20.478 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:20.478 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:06:20.478 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:20.478 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:20.478 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:06:20.478 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:20.478 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:20.478 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:06:20.478 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:20.478 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:20.478 
element at address: 0x2000003bc280 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:06:20.478 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:20.478 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:20.478 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:06:20.478 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:20.478 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:20.478 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:06:20.478 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:20.478 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:20.478 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:06:20.478 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:20.478 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:20.478 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_11 00:06:20.478 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:20.478 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:20.478 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:06:20.478 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:20.478 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:20.478 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:06:20.478 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:20.478 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:20.478 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:06:20.478 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:20.478 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:20.478 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:06:20.478 element at address: 0x20000039b600 with size: 
0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:20.478 element at address: 0x20000039b440 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:20.478 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:06:20.478 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:06:20.479 element at address: 0x200000397b40 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:20.479 element at address: 0x200000397980 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:20.479 element at address: 0x200000397700 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:06:20.479 element at address: 0x200000394080 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:20.479 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:20.479 element at address: 0x200000393c40 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:06:20.479 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:20.479 element at address: 0x200000390400 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:20.479 element at address: 0x200000390180 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:06:20.479 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 
00:06:20.479 element at address: 0x20000038c940 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:20.479 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:06:20.479 element at address: 0x200000389040 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:20.479 element at address: 0x200000388e80 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:20.479 element at address: 0x200000388c00 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:06:20.479 element at address: 0x200000385580 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:20.479 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:20.479 element at address: 0x200000385140 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:06:20.479 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:20.479 element at address: 0x200000381900 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:20.479 element at address: 0x200000381680 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:06:20.479 element at address: 0x20000037e000 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:20.479 element at address: 0x20000037de40 with size: 0.000244 MiB 00:06:20.479 associated memzone 
info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:20.479 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:06:20.479 element at address: 0x20000037a540 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:20.479 element at address: 0x20000037a380 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:20.479 element at address: 0x20000037a100 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:06:20.479 element at address: 0x200000376a80 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:20.479 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:20.479 element at address: 0x200000376640 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:06:20.479 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:20.479 element at address: 0x200000372e00 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:20.479 element at address: 0x200000372b80 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:06:20.479 element at address: 0x20000036f500 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:20.479 element at address: 0x20000036f340 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:20.479 element at address: 0x20000036f0c0 with 
size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:06:20.479 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:20.479 element at address: 0x20000036b880 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:20.479 element at address: 0x20000036b600 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:06:20.479 element at address: 0x200000367f80 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:20.479 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:20.479 element at address: 0x200000367b40 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:06:20.479 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:20.479 element at address: 0x200000364300 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:20.479 element at address: 0x200000364080 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:06:20.479 element at address: 0x200000360a00 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:20.479 element at address: 0x200000360840 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:20.479 element at address: 0x2000003605c0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_32 
00:06:20.479 element at address: 0x20000035cf40 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:20.479 element at address: 0x20000035cd80 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:20.479 element at address: 0x20000035cb00 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_33 00:06:20.479 element at address: 0x200000359480 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:20.479 element at address: 0x2000003592c0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:20.479 element at address: 0x200000359040 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_34 00:06:20.479 element at address: 0x2000003559c0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:20.479 element at address: 0x200000355800 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:20.479 element at address: 0x200000355580 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_35 00:06:20.479 element at address: 0x200000351f00 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:20.479 element at address: 0x200000351d40 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:20.479 element at address: 0x200000351ac0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_36 00:06:20.479 element at address: 0x20000034e440 with size: 0.000244 MiB 00:06:20.479 associated memzone 
info: size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:20.479 element at address: 0x20000034e280 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:20.479 element at address: 0x20000034e000 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_37 00:06:20.479 element at address: 0x20000034a980 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:20.479 element at address: 0x20000034a7c0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:20.479 element at address: 0x20000034a540 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_38 00:06:20.479 element at address: 0x200000346ec0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:20.479 element at address: 0x200000346d00 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:20.479 element at address: 0x200000346a80 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_39 00:06:20.479 element at address: 0x200000343400 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:20.479 element at address: 0x200000343240 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:20.479 element at address: 0x200000342fc0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_40 00:06:20.479 element at address: 0x20000033f940 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:20.479 element at address: 0x20000033f780 with 
size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:20.479 element at address: 0x20000033f500 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_41 00:06:20.479 element at address: 0x20000033be80 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:20.479 element at address: 0x20000033bcc0 with size: 0.000244 MiB 00:06:20.479 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:20.480 element at address: 0x20000033ba40 with size: 0.000244 MiB 00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_42 00:06:20.480 element at address: 0x2000003383c0 with size: 0.000244 MiB 00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:20.480 element at address: 0x200000338200 with size: 0.000244 MiB 00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:20.480 element at address: 0x200000337f80 with size: 0.000244 MiB 00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_43 00:06:20.480 element at address: 0x200000334900 with size: 0.000244 MiB 00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:20.480 element at address: 0x200000334740 with size: 0.000244 MiB 00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:20.480 element at address: 0x2000003344c0 with size: 0.000244 MiB 00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_44 00:06:20.480 element at address: 0x200000330e40 with size: 0.000244 MiB 00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:20.480 element at address: 0x200000330c80 with size: 0.000244 MiB 00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_91 
00:06:20.480 element at address: 0x200000330a00 with size: 0.000244 MiB
00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_45
00:06:20.480 element at address: 0x20000032d380 with size: 0.000244 MiB
00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_92
00:06:20.480 element at address: 0x20000032d1c0 with size: 0.000244 MiB
00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_93
00:06:20.480 element at address: 0x20000032cf40 with size: 0.000244 MiB
00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_46
00:06:20.480 element at address: 0x2000003298c0 with size: 0.000244 MiB
00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_94
00:06:20.480 element at address: 0x200000329700 with size: 0.000244 MiB
00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_95
00:06:20.480 element at address: 0x200000329480 with size: 0.000244 MiB
00:06:20.480 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_47
00:06:20.480 element at address: 0x2000003d5d00 with size: 0.000183 MiB
00:06:20.480 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:06:20.739 11:47:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:06:20.739 11:47:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 554861
00:06:20.739 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 554861 ']'
00:06:20.739 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 554861
00:06:20.739 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname
00:06:20.739 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:06:20.739 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 554861
00:06:20.739 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:06:20.739 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:06:20.739 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 554861'
00:06:20.739 killing process with pid 554861 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 554861
00:06:20.739 11:47:10 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 554861
00:06:20.999
00:06:20.999 real 0m1.434s
00:06:20.999 user 0m1.503s
00:06:20.999 sys 0m0.418s
00:06:20.999 11:47:11 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:20.999 11:47:11 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:06:20.999 ************************************
00:06:20.999 END TEST dpdk_mem_utility
00:06:20.999 ************************************
00:06:20.999 11:47:11 -- common/autotest_common.sh@1142 -- # return 0
00:06:20.999 11:47:11 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:06:20.999 11:47:11 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:20.999 11:47:11 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:20.999 11:47:11 -- common/autotest_common.sh@10 -- # set +x
00:06:20.999 ************************************
00:06:20.999 START TEST event
00:06:20.999 ************************************
00:06:20.999 11:47:11 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:06:21.000 * Looking for test storage...
00:06:21.000 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event
00:06:21.000 11:47:11 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:06:21.000 11:47:11 event -- bdev/nbd_common.sh@6 -- # set -e
00:06:21.000 11:47:11 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:06:21.000 11:47:11 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']'
00:06:21.000 11:47:11 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:21.000 11:47:11 event -- common/autotest_common.sh@10 -- # set +x
00:06:21.259 ************************************
00:06:21.260 START TEST event_perf
00:06:21.260 ************************************
00:06:21.260 11:47:11 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:06:21.260 Running I/O for 1 seconds...[2024-07-12 11:47:11.274614] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization...
00:06:21.260 [2024-07-12 11:47:11.274661] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid555202 ]
00:06:21.260 [2024-07-12 11:47:11.350666] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:06:21.260 [2024-07-12 11:47:11.425190] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:21.260 [2024-07-12 11:47:11.425299] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:06:21.260 [2024-07-12 11:47:11.425403] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:21.260 [2024-07-12 11:47:11.425404] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:06:22.635 Running I/O for 1 seconds...
00:06:22.635 lcore 0: 212262 00:06:22.635 lcore 1: 212261 00:06:22.635 lcore 2: 212262 00:06:22.635 lcore 3: 212262 00:06:22.635 done. 00:06:22.635 00:06:22.635 real 0m1.245s 00:06:22.635 user 0m4.151s 00:06:22.635 sys 0m0.093s 00:06:22.635 11:47:12 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:22.635 11:47:12 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:22.635 ************************************ 00:06:22.635 END TEST event_perf 00:06:22.635 ************************************ 00:06:22.635 11:47:12 event -- common/autotest_common.sh@1142 -- # return 0 00:06:22.635 11:47:12 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:22.635 11:47:12 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:22.635 11:47:12 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.635 11:47:12 event -- common/autotest_common.sh@10 -- # set +x 00:06:22.635 ************************************ 00:06:22.635 START TEST event_reactor 00:06:22.635 ************************************ 00:06:22.635 11:47:12 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:22.635 [2024-07-12 11:47:12.588402] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:06:22.635 [2024-07-12 11:47:12.588447] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid555446 ] 00:06:22.635 [2024-07-12 11:47:12.664452] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.635 [2024-07-12 11:47:12.736132] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.569 test_start 00:06:23.569 oneshot 00:06:23.569 tick 100 00:06:23.569 tick 100 00:06:23.569 tick 250 00:06:23.569 tick 100 00:06:23.569 tick 100 00:06:23.569 tick 100 00:06:23.569 tick 250 00:06:23.569 tick 500 00:06:23.569 tick 100 00:06:23.569 tick 100 00:06:23.569 tick 250 00:06:23.569 tick 100 00:06:23.569 tick 100 00:06:23.569 test_end 00:06:23.569 00:06:23.570 real 0m1.238s 00:06:23.570 user 0m1.149s 00:06:23.570 sys 0m0.085s 00:06:23.570 11:47:13 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:23.570 11:47:13 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:23.570 ************************************ 00:06:23.570 END TEST event_reactor 00:06:23.570 ************************************ 00:06:23.828 11:47:13 event -- common/autotest_common.sh@1142 -- # return 0 00:06:23.828 11:47:13 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:23.828 11:47:13 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:23.828 11:47:13 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.828 11:47:13 event -- common/autotest_common.sh@10 -- # set +x 00:06:23.828 ************************************ 00:06:23.828 START TEST event_reactor_perf 00:06:23.828 ************************************ 00:06:23.828 11:47:13 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:23.828 [2024-07-12 11:47:13.894093] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:23.828 [2024-07-12 11:47:13.894141] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid555654 ] 00:06:23.828 [2024-07-12 11:47:13.971776] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.828 [2024-07-12 11:47:14.043061] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.206 test_start 00:06:25.206 test_end 00:06:25.206 Performance: 520895 events per second 00:06:25.206 00:06:25.206 real 0m1.242s 00:06:25.206 user 0m1.148s 00:06:25.206 sys 0m0.089s 00:06:25.206 11:47:15 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:25.206 11:47:15 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:25.206 ************************************ 00:06:25.206 END TEST event_reactor_perf 00:06:25.206 ************************************ 00:06:25.206 11:47:15 event -- common/autotest_common.sh@1142 -- # return 0 00:06:25.206 11:47:15 event -- event/event.sh@49 -- # uname -s 00:06:25.206 11:47:15 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:25.206 11:47:15 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:25.206 11:47:15 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:25.206 11:47:15 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:25.206 11:47:15 event -- common/autotest_common.sh@10 -- # set +x 00:06:25.206 ************************************ 00:06:25.206 START TEST event_scheduler 00:06:25.206 ************************************ 00:06:25.206 11:47:15 
event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:25.206 * Looking for test storage... 00:06:25.206 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:06:25.206 11:47:15 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:25.206 11:47:15 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=555943 00:06:25.206 11:47:15 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:25.206 11:47:15 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:25.206 11:47:15 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 555943 00:06:25.206 11:47:15 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 555943 ']' 00:06:25.206 11:47:15 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.206 11:47:15 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.206 11:47:15 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.206 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:25.206 11:47:15 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.206 11:47:15 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:25.206 [2024-07-12 11:47:15.315514] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:06:25.206 [2024-07-12 11:47:15.315581] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid555943 ] 00:06:25.206 [2024-07-12 11:47:15.393180] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:25.464 [2024-07-12 11:47:15.473125] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.464 [2024-07-12 11:47:15.473235] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:25.464 [2024-07-12 11:47:15.473339] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:25.464 [2024-07-12 11:47:15.473340] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:06:26.030 11:47:16 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:26.030 POWER: Env isn't set yet! 00:06:26.030 POWER: Attempting to initialise ACPI cpufreq power management... 00:06:26.030 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:06:26.030 POWER: Cannot set governor of lcore 0 to userspace 00:06:26.030 POWER: Attempting to initialise PSTAT power management... 
00:06:26.030 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:06:26.030 POWER: Initialized successfully for lcore 0 power management 00:06:26.030 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:06:26.030 POWER: Initialized successfully for lcore 1 power management 00:06:26.030 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:06:26.030 POWER: Initialized successfully for lcore 2 power management 00:06:26.030 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:06:26.030 POWER: Initialized successfully for lcore 3 power management 00:06:26.030 [2024-07-12 11:47:16.155726] scheduler_dynamic.c: 382:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:26.030 [2024-07-12 11:47:16.155739] scheduler_dynamic.c: 384:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:26.030 [2024-07-12 11:47:16.155748] scheduler_dynamic.c: 386:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.030 11:47:16 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:26.030 [2024-07-12 11:47:16.230711] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.030 11:47:16 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.030 11:47:16 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:26.030 ************************************ 00:06:26.030 START TEST scheduler_create_thread 00:06:26.030 ************************************ 00:06:26.030 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:06:26.030 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:26.030 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.030 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.289 2 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.289 3 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- 
scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.289 4 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.289 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.290 5 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.290 6 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@10 -- # set +x 00:06:26.290 7 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.290 8 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.290 9 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.290 10 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
half_active -a 0 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.290 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.857 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.857 11:47:16 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:26.857 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.857 11:47:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:28.233 11:47:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:28.233 11:47:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:28.233 11:47:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:28.233 11:47:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.233 11:47:18 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:29.170 11:47:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:29.170 00:06:29.170 real 0m3.099s 00:06:29.170 user 0m0.023s 00:06:29.170 sys 0m0.005s 00:06:29.170 11:47:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:29.170 11:47:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:29.170 ************************************ 00:06:29.170 END TEST scheduler_create_thread 00:06:29.170 ************************************ 00:06:29.170 11:47:19 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:06:29.170 11:47:19 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:29.170 11:47:19 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 555943 00:06:29.170 11:47:19 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 555943 ']' 00:06:29.170 11:47:19 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 555943 00:06:29.170 11:47:19 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:06:29.170 11:47:19 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:29.170 11:47:19 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 555943 00:06:29.428 11:47:19 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:06:29.428 11:47:19 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:06:29.428 11:47:19 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 555943' 00:06:29.428 killing process with pid 555943 00:06:29.428 11:47:19 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 555943 00:06:29.428 11:47:19 event.event_scheduler -- 
common/autotest_common.sh@972 -- # wait 555943 00:06:29.687 [2024-07-12 11:47:19.745892] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:06:29.687 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:29.687 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:29.687 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:29.687 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:29.687 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:29.687 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:29.687 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:29.687 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:29.946 00:06:29.946 real 0m4.778s 00:06:29.946 user 0m9.214s 00:06:29.946 sys 0m0.384s 00:06:29.946 11:47:19 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:29.946 11:47:19 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:29.946 ************************************ 00:06:29.946 END TEST event_scheduler 00:06:29.946 ************************************ 00:06:29.946 11:47:19 event -- common/autotest_common.sh@1142 -- # return 0 00:06:29.946 11:47:19 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:29.946 11:47:20 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:29.946 11:47:20 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:29.946 11:47:20 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.946 11:47:20 event -- common/autotest_common.sh@10 -- # set +x 00:06:29.946 ************************************ 00:06:29.946 START TEST 
app_repeat 00:06:29.946 ************************************ 00:06:29.946 11:47:20 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@19 -- # repeat_pid=556868 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 556868' 00:06:29.946 Process app_repeat pid: 556868 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:29.946 spdk_app_start Round 0 00:06:29.946 11:47:20 event.app_repeat -- event/event.sh@25 -- # waitforlisten 556868 /var/tmp/spdk-nbd.sock 00:06:29.946 11:47:20 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 556868 ']' 00:06:29.946 11:47:20 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:29.946 11:47:20 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:29.946 11:47:20 event.app_repeat -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:29.946 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:29.946 11:47:20 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:29.946 11:47:20 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:29.946 [2024-07-12 11:47:20.064649] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:29.946 [2024-07-12 11:47:20.064689] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid556868 ] 00:06:29.946 [2024-07-12 11:47:20.125414] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:30.205 [2024-07-12 11:47:20.207537] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:30.205 [2024-07-12 11:47:20.207540] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.774 11:47:20 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:30.774 11:47:20 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:30.774 11:47:20 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:31.033 Malloc0 00:06:31.033 11:47:21 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:31.033 Malloc1 00:06:31.033 11:47:21 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.033 11:47:21 event.app_repeat 
-- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.033 11:47:21 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:31.291 /dev/nbd0 00:06:31.291 11:47:21 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:31.291 11:47:21 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:31.292 11:47:21 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:31.292 1+0 records in 00:06:31.292 1+0 records out 00:06:31.292 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000186796 s, 21.9 MB/s 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:31.292 11:47:21 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:31.292 11:47:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:31.292 11:47:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.292 11:47:21 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:31.550 /dev/nbd1 00:06:31.550 11:47:21 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:31.550 11:47:21 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:31.550 11:47:21 event.app_repeat 
-- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:31.550 1+0 records in 00:06:31.550 1+0 records out 00:06:31.550 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000206237 s, 19.9 MB/s 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:31.550 11:47:21 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:31.550 11:47:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:31.550 11:47:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.550 11:47:21 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:31.551 11:47:21 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.551 11:47:21 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:31.809 11:47:21 event.app_repeat 
-- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:31.809 { 00:06:31.809 "nbd_device": "/dev/nbd0", 00:06:31.809 "bdev_name": "Malloc0" 00:06:31.809 }, 00:06:31.809 { 00:06:31.809 "nbd_device": "/dev/nbd1", 00:06:31.809 "bdev_name": "Malloc1" 00:06:31.809 } 00:06:31.809 ]' 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:31.809 { 00:06:31.809 "nbd_device": "/dev/nbd0", 00:06:31.809 "bdev_name": "Malloc0" 00:06:31.809 }, 00:06:31.809 { 00:06:31.809 "nbd_device": "/dev/nbd1", 00:06:31.809 "bdev_name": "Malloc1" 00:06:31.809 } 00:06:31.809 ]' 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:31.809 /dev/nbd1' 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:31.809 /dev/nbd1' 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:31.809 11:47:21 event.app_repeat -- 
bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:31.809 256+0 records in 00:06:31.809 256+0 records out 00:06:31.809 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00994455 s, 105 MB/s 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:31.809 11:47:21 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:31.809 256+0 records in 00:06:31.809 256+0 records out 00:06:31.809 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0138222 s, 75.9 MB/s 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:31.810 256+0 records in 00:06:31.810 256+0 records out 00:06:31.810 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0144367 s, 72.6 MB/s 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:31.810 11:47:21 event.app_repeat -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:31.810 11:47:21 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 
00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:32.068 11:47:22 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:32.327 11:47:22 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:32.585 11:47:22 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:32.585 11:47:22 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:32.585 11:47:22 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:32.586 11:47:22 event.app_repeat -- bdev/nbd_common.sh@65 
-- # count=0 00:06:32.586 11:47:22 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:32.586 11:47:22 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:32.586 11:47:22 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:32.586 11:47:22 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:32.586 11:47:22 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:32.586 11:47:22 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:32.844 [2024-07-12 11:47:22.951638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:32.844 [2024-07-12 11:47:23.017672] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:32.844 [2024-07-12 11:47:23.017675] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.844 [2024-07-12 11:47:23.058130] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:32.844 [2024-07-12 11:47:23.058167] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
00:06:36.129 11:47:25 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:36.129 11:47:25 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:36.129 spdk_app_start Round 1 00:06:36.129 11:47:25 event.app_repeat -- event/event.sh@25 -- # waitforlisten 556868 /var/tmp/spdk-nbd.sock 00:06:36.129 11:47:25 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 556868 ']' 00:06:36.129 11:47:25 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:36.129 11:47:25 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:36.129 11:47:25 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:36.129 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:36.129 11:47:25 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:36.129 11:47:25 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:36.129 11:47:25 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:36.129 11:47:25 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:36.129 11:47:25 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:36.129 Malloc0 00:06:36.129 11:47:26 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:36.129 Malloc1 00:06:36.129 11:47:26 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1') 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:36.129 11:47:26 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:36.388 /dev/nbd0 00:06:36.388 11:47:26 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:36.388 11:47:26 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@870 -- # 
grep -q -w nbd0 /proc/partitions 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:36.388 1+0 records in 00:06:36.388 1+0 records out 00:06:36.388 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000194384 s, 21.1 MB/s 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:36.388 11:47:26 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:36.388 11:47:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:36.388 11:47:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:36.388 11:47:26 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:36.647 /dev/nbd1 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@869 -- 
# (( i = 1 )) 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:36.647 1+0 records in 00:06:36.647 1+0 records out 00:06:36.647 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000182013 s, 22.5 MB/s 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:36.647 11:47:26 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@63 -- # 
nbd_disks_json='[ 00:06:36.647 { 00:06:36.647 "nbd_device": "/dev/nbd0", 00:06:36.647 "bdev_name": "Malloc0" 00:06:36.647 }, 00:06:36.647 { 00:06:36.647 "nbd_device": "/dev/nbd1", 00:06:36.647 "bdev_name": "Malloc1" 00:06:36.647 } 00:06:36.647 ]' 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:36.647 { 00:06:36.647 "nbd_device": "/dev/nbd0", 00:06:36.647 "bdev_name": "Malloc0" 00:06:36.647 }, 00:06:36.647 { 00:06:36.647 "nbd_device": "/dev/nbd1", 00:06:36.647 "bdev_name": "Malloc1" 00:06:36.647 } 00:06:36.647 ]' 00:06:36.647 11:47:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:36.906 /dev/nbd1' 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:36.906 /dev/nbd1' 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd 
if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:36.906 256+0 records in 00:06:36.906 256+0 records out 00:06:36.906 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103413 s, 101 MB/s 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:36.906 256+0 records in 00:06:36.906 256+0 records out 00:06:36.906 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0136085 s, 77.1 MB/s 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:36.906 256+0 records in 00:06:36.906 256+0 records out 00:06:36.906 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0143133 s, 73.3 MB/s 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:36.906 11:47:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 
1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:36.907 11:47:26 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:37.166 11:47:27 event.app_repeat -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:37.166 11:47:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:37.423 11:47:27 
event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:37.423 11:47:27 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:37.423 11:47:27 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:37.681 11:47:27 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:37.940 [2024-07-12 11:47:27.972680] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:37.940 [2024-07-12 11:47:28.037675] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:37.940 [2024-07-12 11:47:28.037677] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.940 [2024-07-12 11:47:28.078919] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:37.940 [2024-07-12 11:47:28.078954] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:41.221 11:47:30 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:41.221 11:47:30 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:41.221 spdk_app_start Round 2 00:06:41.221 11:47:30 event.app_repeat -- event/event.sh@25 -- # waitforlisten 556868 /var/tmp/spdk-nbd.sock 00:06:41.221 11:47:30 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 556868 ']' 00:06:41.221 11:47:30 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:41.221 11:47:30 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:41.221 11:47:30 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:06:41.221 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:41.221 11:47:30 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:41.221 11:47:30 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:41.221 11:47:30 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:41.221 11:47:30 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:41.221 11:47:30 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:41.221 Malloc0 00:06:41.221 11:47:31 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:41.221 Malloc1 00:06:41.221 11:47:31 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:41.221 11:47:31 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.222 11:47:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:41.222 11:47:31 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:41.222 11:47:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:41.222 11:47:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:41.222 11:47:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:41.480 /dev/nbd0 00:06:41.480 11:47:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:41.480 11:47:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:41.480 1+0 records in 00:06:41.480 1+0 records out 00:06:41.480 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00018282 s, 22.4 MB/s 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:41.480 11:47:31 event.app_repeat 
-- common/autotest_common.sh@884 -- # size=4096 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:41.480 11:47:31 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:41.480 11:47:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:41.480 11:47:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:41.480 11:47:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:41.480 /dev/nbd1 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:41.739 1+0 records in 00:06:41.739 1+0 records out 00:06:41.739 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000130252 s, 31.4 MB/s 00:06:41.739 
11:47:31 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:41.739 11:47:31 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:41.739 { 00:06:41.739 "nbd_device": "/dev/nbd0", 00:06:41.739 "bdev_name": "Malloc0" 00:06:41.739 }, 00:06:41.739 { 00:06:41.739 "nbd_device": "/dev/nbd1", 00:06:41.739 "bdev_name": "Malloc1" 00:06:41.739 } 00:06:41.739 ]' 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:41.739 { 00:06:41.739 "nbd_device": "/dev/nbd0", 00:06:41.739 "bdev_name": "Malloc0" 00:06:41.739 }, 00:06:41.739 { 00:06:41.739 "nbd_device": "/dev/nbd1", 00:06:41.739 "bdev_name": "Malloc1" 00:06:41.739 } 00:06:41.739 ]' 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:41.739 /dev/nbd1' 00:06:41.739 11:47:31 event.app_repeat -- 
bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:41.739 /dev/nbd1' 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:41.739 11:47:31 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:41.998 256+0 records in 00:06:41.998 256+0 records out 00:06:41.998 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010288 s, 102 MB/s 00:06:41.998 11:47:31 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:41.998 256+0 records in 00:06:41.998 256+0 records out 00:06:41.998 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0136235 s, 77.0 MB/s 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:41.998 11:47:32 event.app_repeat -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:41.998 256+0 records in 00:06:41.998 256+0 records out 00:06:41.998 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0142048 s, 73.8 MB/s 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:41.998 11:47:32 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:42.257 11:47:32 
event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:42.257 11:47:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:42.514 11:47:32 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:42.514 11:47:32 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:42.772 11:47:32 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:43.031 [2024-07-12 11:47:33.038881] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:43.031 [2024-07-12 11:47:33.105478] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:43.031 [2024-07-12 
11:47:33.105481] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.031 [2024-07-12 11:47:33.146076] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:43.031 [2024-07-12 11:47:33.146115] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:46.314 11:47:35 event.app_repeat -- event/event.sh@38 -- # waitforlisten 556868 /var/tmp/spdk-nbd.sock 00:06:46.314 11:47:35 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 556868 ']' 00:06:46.314 11:47:35 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:46.314 11:47:35 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:46.314 11:47:35 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:46.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
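The write-then-verify pass recorded earlier in this run (dd from the nbdrandtest file onto each /dev/nbd device, then cmp each device back against the file) can be reduced to a small helper. This is an illustrative sketch, not the actual nbd_dd_data_verify implementation; it drops the oflag=direct and 1M cmp-window details so it also works against ordinary files.

```shell
#!/bin/sh
# verify_copies SRC TARGET...: copy SRC onto each TARGET with dd,
# then read each TARGET back with cmp, failing on the first mismatch.
# Sketch of the nbd write+verify pass above (the real run adds
# oflag=direct and limits the comparison with cmp -b -n 1M).
verify_copies() {
    src=$1; shift
    for t in "$@"; do
        dd if="$src" of="$t" bs=4096 2>/dev/null || return 1
        cmp "$src" "$t" || return 1
    done
}
```

In the log, SRC is the 1 MiB /dev/urandom-filled nbdrandtest file and the targets are /dev/nbd0 and /dev/nbd1.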
00:06:46.314 11:47:35 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:46.314 11:47:35 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:46.314 11:47:36 event.app_repeat -- event/event.sh@39 -- # killprocess 556868 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 556868 ']' 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 556868 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 556868 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 556868' 00:06:46.314 killing process with pid 556868 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@967 -- # kill 556868 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@972 -- # wait 556868 00:06:46.314 spdk_app_start is called in Round 0. 00:06:46.314 Shutdown signal received, stop current app iteration 00:06:46.314 Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 reinitialization... 00:06:46.314 spdk_app_start is called in Round 1. 00:06:46.314 Shutdown signal received, stop current app iteration 00:06:46.314 Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 reinitialization... 00:06:46.314 spdk_app_start is called in Round 2. 
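The waitfornbd, waitfornbd_exit, and waitforlisten helpers exercised throughout this run share one shape: poll a condition up to 20 times and break on success. A generic sketch of that loop follows; the name `retry` and the 0.1 s delay are illustrative, while the real helpers grep /proc/partitions and then confirm the device with a one-block direct-I/O dd read.

```shell
#!/bin/sh
# retry N CMD...: run CMD up to N times, pausing briefly between
# attempts; succeed as soon as CMD does, fail after N tries.
retry() {
    n=$1; shift
    i=1
    while [ "$i" -le "$n" ]; do
        "$@" && return 0
        i=$((i + 1))
        sleep 0.1
    done
    return 1
}
```

waitfornbd is then roughly `retry 20 grep -q -w nbd0 /proc/partitions` followed by the dd read check seen in the log.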
00:06:46.314 Shutdown signal received, stop current app iteration 00:06:46.314 Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 reinitialization... 00:06:46.314 spdk_app_start is called in Round 3. 00:06:46.314 Shutdown signal received, stop current app iteration 00:06:46.314 11:47:36 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:46.314 11:47:36 event.app_repeat -- event/event.sh@42 -- # return 0 00:06:46.314 00:06:46.314 real 0m16.214s 00:06:46.314 user 0m35.045s 00:06:46.314 sys 0m2.359s 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:46.314 11:47:36 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:46.314 ************************************ 00:06:46.314 END TEST app_repeat 00:06:46.314 ************************************ 00:06:46.314 11:47:36 event -- common/autotest_common.sh@1142 -- # return 0 00:06:46.314 11:47:36 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:46.314 00:06:46.314 real 0m25.148s 00:06:46.314 user 0m50.875s 00:06:46.314 sys 0m3.304s 00:06:46.314 11:47:36 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:46.314 11:47:36 event -- common/autotest_common.sh@10 -- # set +x 00:06:46.314 ************************************ 00:06:46.314 END TEST event 00:06:46.314 ************************************ 00:06:46.314 11:47:36 -- common/autotest_common.sh@1142 -- # return 0 00:06:46.314 11:47:36 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:46.314 11:47:36 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:46.314 11:47:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.314 11:47:36 -- common/autotest_common.sh@10 -- # set +x 00:06:46.314 ************************************ 00:06:46.314 START TEST thread 00:06:46.314 ************************************ 00:06:46.314 11:47:36 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:46.314 * Looking for test storage... 00:06:46.314 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:06:46.314 11:47:36 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:46.314 11:47:36 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:06:46.314 11:47:36 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.314 11:47:36 thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.314 ************************************ 00:06:46.314 START TEST thread_poller_perf 00:06:46.314 ************************************ 00:06:46.314 11:47:36 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:46.314 [2024-07-12 11:47:36.478001] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:46.314 [2024-07-12 11:47:36.478063] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid559837 ] 00:06:46.314 [2024-07-12 11:47:36.544585] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.573 [2024-07-12 11:47:36.618142] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.573 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:47.508 ====================================== 00:06:47.508 busy:2108099166 (cyc) 00:06:47.508 total_run_count: 425000 00:06:47.508 tsc_hz: 2100000000 (cyc) 00:06:47.508 ====================================== 00:06:47.508 poller_cost: 4960 (cyc), 2361 (nsec) 00:06:47.508 00:06:47.508 real 0m1.238s 00:06:47.508 user 0m1.149s 00:06:47.508 sys 0m0.085s 00:06:47.508 11:47:37 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:47.508 11:47:37 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:47.508 ************************************ 00:06:47.508 END TEST thread_poller_perf 00:06:47.508 ************************************ 00:06:47.508 11:47:37 thread -- common/autotest_common.sh@1142 -- # return 0 00:06:47.509 11:47:37 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:47.509 11:47:37 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:06:47.509 11:47:37 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:47.509 11:47:37 thread -- common/autotest_common.sh@10 -- # set +x 00:06:47.768 ************************************ 00:06:47.768 START TEST thread_poller_perf 00:06:47.768 ************************************ 00:06:47.768 11:47:37 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:47.768 [2024-07-12 11:47:37.783618] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
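The poller_cost line in the report above is derived from the two counters and the TSC rate: cycles per poll is the busy cycle count divided by total_run_count, and the nanosecond figure follows from the 2.1 GHz tsc_hz. Reproducing the first report's numbers:

```shell
#!/bin/sh
# poller_cost in cycles = busy (cyc) / total_run_count;
# poller_cost in nsec   = cycles * 1e9 / tsc_hz.
# Counter values taken from the first poller_perf report above.
busy=2108099166
runs=425000
tsc_hz=2100000000
cyc=$((busy / runs))
nsec=$((cyc * 1000000000 / tsc_hz))
echo "poller_cost: $cyc (cyc), $nsec (nsec)"
# prints: poller_cost: 4960 (cyc), 2361 (nsec)
```

The same arithmetic on the second run (2101431856 busy cycles over 5550000 polls) gives the 378 cyc / 180 nsec cost reported below it.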
00:06:47.768 [2024-07-12 11:47:37.783681] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid560083 ] 00:06:47.768 [2024-07-12 11:47:37.849726] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.768 [2024-07-12 11:47:37.919858] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.768 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:49.146 ====================================== 00:06:49.146 busy:2101431856 (cyc) 00:06:49.146 total_run_count: 5550000 00:06:49.146 tsc_hz: 2100000000 (cyc) 00:06:49.146 ====================================== 00:06:49.146 poller_cost: 378 (cyc), 180 (nsec) 00:06:49.146 00:06:49.146 real 0m1.234s 00:06:49.146 user 0m1.147s 00:06:49.146 sys 0m0.083s 00:06:49.146 11:47:38 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:49.146 11:47:38 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:49.146 ************************************ 00:06:49.146 END TEST thread_poller_perf 00:06:49.146 ************************************ 00:06:49.146 11:47:39 thread -- common/autotest_common.sh@1142 -- # return 0 00:06:49.146 11:47:39 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:06:49.146 00:06:49.146 real 0m2.682s 00:06:49.146 user 0m2.383s 00:06:49.146 sys 0m0.304s 00:06:49.146 11:47:39 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:49.146 11:47:39 thread -- common/autotest_common.sh@10 -- # set +x 00:06:49.146 ************************************ 00:06:49.146 END TEST thread 00:06:49.146 ************************************ 00:06:49.146 11:47:39 -- common/autotest_common.sh@1142 -- # return 0 00:06:49.146 11:47:39 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:49.146 
11:47:39 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:49.146 11:47:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:49.146 11:47:39 -- common/autotest_common.sh@10 -- # set +x 00:06:49.146 ************************************ 00:06:49.146 START TEST accel 00:06:49.146 ************************************ 00:06:49.146 11:47:39 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:49.146 * Looking for test storage... 00:06:49.146 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:06:49.146 11:47:39 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:06:49.146 11:47:39 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:06:49.146 11:47:39 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:49.146 11:47:39 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=560369 00:06:49.146 11:47:39 accel -- accel/accel.sh@63 -- # waitforlisten 560369 00:06:49.146 11:47:39 accel -- common/autotest_common.sh@829 -- # '[' -z 560369 ']' 00:06:49.146 11:47:39 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:49.146 11:47:39 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:49.146 11:47:39 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:49.146 11:47:39 accel -- accel/accel.sh@61 -- # build_accel_config 00:06:49.146 11:47:39 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:49.146 11:47:39 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:49.146 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:49.146 11:47:39 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:49.146 11:47:39 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:49.146 11:47:39 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.146 11:47:39 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.146 11:47:39 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:49.146 11:47:39 accel -- common/autotest_common.sh@10 -- # set +x 00:06:49.146 11:47:39 accel -- accel/accel.sh@40 -- # local IFS=, 00:06:49.146 11:47:39 accel -- accel/accel.sh@41 -- # jq -r . 00:06:49.146 [2024-07-12 11:47:39.240705] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:49.146 [2024-07-12 11:47:39.240751] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid560369 ] 00:06:49.146 [2024-07-12 11:47:39.305854] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.146 [2024-07-12 11:47:39.376321] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.083 11:47:40 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@862 -- # return 0 00:06:50.084 11:47:40 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:06:50.084 11:47:40 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:06:50.084 11:47:40 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:06:50.084 11:47:40 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:06:50.084 11:47:40 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:50.084 11:47:40 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:50.084 11:47:40 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@10 -- # set +x 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # 
IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # IFS== 00:06:50.084 11:47:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:50.084 11:47:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:50.084 11:47:40 accel -- accel/accel.sh@75 -- # killprocess 560369 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@948 -- # '[' -z 560369 ']' 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@952 -- # kill -0 560369 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@953 -- # uname 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 560369 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 560369' 00:06:50.084 killing process with pid 560369 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@967 -- # kill 560369 00:06:50.084 11:47:40 accel -- common/autotest_common.sh@972 -- # wait 560369 00:06:50.343 11:47:40 accel -- accel/accel.sh@76 -- # trap - ERR 00:06:50.343 11:47:40 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:06:50.343 11:47:40 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 
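The long run of `IFS==` / `read -r opc module` lines above is accel.sh walking its opcode list: each entry has the form `opc=module`, and setting IFS to `=` just for the read splits it into the two fields. A minimal reproduction with illustrative entries (the real list comes from `rpc.py accel_get_opc_assignments` piped through jq):

```shell
#!/bin/sh
# Split "opc=module" pairs the way accel.sh's expected_opcs loop does:
# IFS== sets IFS to '=' for this read only, so each line breaks at '='.
while IFS== read -r opc module; do
    echo "expected_opcs[$opc]=$module"
done <<'EOF'
copy=software
fill=software
crc32c=software
EOF
```

Each iteration corresponds to one `expected_opcs["$opc"]=software` assignment in the log.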
00:06:50.343 11:47:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.343 11:47:40 accel -- common/autotest_common.sh@10 -- # set +x 00:06:50.343 11:47:40 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:06:50.343 11:47:40 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:50.343 11:47:40 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:06:50.343 11:47:40 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:50.343 11:47:40 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:50.343 11:47:40 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.343 11:47:40 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.343 11:47:40 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:50.343 11:47:40 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:06:50.343 11:47:40 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:06:50.343 11:47:40 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.343 11:47:40 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:06:50.343 11:47:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:50.343 11:47:40 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:50.343 11:47:40 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:50.343 11:47:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.343 11:47:40 accel -- common/autotest_common.sh@10 -- # set +x 00:06:50.343 ************************************ 00:06:50.343 START TEST accel_missing_filename 00:06:50.343 ************************************ 00:06:50.343 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:06:50.343 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:06:50.343 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:50.343 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:50.343 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:50.343 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:50.343 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:50.343 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:06:50.343 11:47:40 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:06:50.343 11:47:40 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:50.343 11:47:40 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:50.343 11:47:40 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:50.343 11:47:40 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.343 11:47:40 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.343 11:47:40 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:50.343 11:47:40 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:06:50.343 11:47:40 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:06:50.343 [2024-07-12 11:47:40.569057] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:50.343 [2024-07-12 11:47:40.569101] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid560638 ] 00:06:50.602 [2024-07-12 11:47:40.633777] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.602 [2024-07-12 11:47:40.705353] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.602 [2024-07-12 11:47:40.759136] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:50.602 [2024-07-12 11:47:40.818935] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:06:50.862 A filename is required. 
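The "A filename is required." failure above is accel_perf rejecting a compress workload launched without `-l`. An illustrative guard in that spirit; `require_input_file` is a hypothetical helper for this sketch, not accel_perf's actual argument handling:

```shell
# Require an input file for compress/decompress workloads, mirroring the
# "A filename is required." failure above (hypothetical helper).
require_input_file() {
    local workload=$1 infile=$2
    case "$workload" in
        compress|decompress)
            if [ -z "$infile" ]; then
                echo "A filename is required." >&2
                return 1
            fi ;;
    esac
    return 0
}
```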
00:06:50.862 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:06:50.862 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:50.862 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:06:50.862 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:06:50.862 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:06:50.862 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:50.862 00:06:50.862 real 0m0.355s 00:06:50.862 user 0m0.245s 00:06:50.862 sys 0m0.126s 00:06:50.862 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.862 11:47:40 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:06:50.862 ************************************ 00:06:50.862 END TEST accel_missing_filename 00:06:50.862 ************************************ 00:06:50.862 11:47:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:50.862 11:47:40 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:50.862 11:47:40 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:06:50.862 11:47:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.862 11:47:40 accel -- common/autotest_common.sh@10 -- # set +x 00:06:50.862 ************************************ 00:06:50.862 START TEST accel_compress_verify 00:06:50.862 ************************************ 00:06:50.862 11:47:40 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:50.862 11:47:40 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:06:50.862 11:47:40 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:50.862 11:47:40 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:50.862 11:47:40 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:50.862 11:47:40 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:50.862 11:47:40 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:50.862 11:47:40 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:50.862 11:47:40 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:50.862 11:47:40 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:50.862 11:47:40 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:50.862 11:47:40 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:50.862 11:47:40 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.862 11:47:40 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.862 11:47:40 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:50.862 11:47:40 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:50.862 11:47:40 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:06:50.862 [2024-07-12 11:47:40.974569] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:06:50.862 [2024-07-12 11:47:40.974614] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid560659 ] 00:06:50.862 [2024-07-12 11:47:41.038906] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.121 [2024-07-12 11:47:41.109911] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.121 [2024-07-12 11:47:41.158898] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:51.121 [2024-07-12 11:47:41.218883] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:06:51.121 00:06:51.121 Compression does not support the verify option, aborting. 00:06:51.121 11:47:41 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:06:51.121 11:47:41 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:51.121 11:47:41 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:06:51.121 11:47:41 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:06:51.121 11:47:41 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:06:51.121 11:47:41 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:51.121 00:06:51.121 real 0m0.348s 00:06:51.121 user 0m0.256s 00:06:51.121 sys 0m0.123s 00:06:51.121 11:47:41 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:51.121 11:47:41 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:06:51.121 ************************************ 00:06:51.121 END TEST accel_compress_verify 00:06:51.121 ************************************ 00:06:51.121 11:47:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:51.121 11:47:41 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 
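The es bookkeeping traced above (es=161, the `(( es > 128 ))` branch, es=33, then es=1) normalizes an exit status before the NOT wrapper inverts it. A hedged sketch, assuming the intent is "statuses above 128 (signal deaths) drop 128, then any remaining nonzero collapses to 1"; the real autotest_common.sh case statement may distinguish more values:

```shell
# Sketch of the exit-status normalization traced above:
# 234 -> 106 (subtract 128 from signal-range statuses), then nonzero -> 1.
normalize_es() {
    local es=$1
    if (( es > 128 )); then
        es=$((es - 128))
    fi
    if (( es != 0 )); then
        es=1
    fi
    echo "$es"
}

normalize_es 234   # → 1
normalize_es 0     # → 0
```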
00:06:51.121 11:47:41 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:51.121 11:47:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.121 11:47:41 accel -- common/autotest_common.sh@10 -- # set +x 00:06:51.121 ************************************ 00:06:51.121 START TEST accel_wrong_workload 00:06:51.121 ************************************ 00:06:51.121 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:06:51.121 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:06:51.121 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:51.121 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:51.121 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:51.121 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:51.122 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:51.122 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:06:51.122 11:47:41 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:51.122 11:47:41 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:06:51.122 11:47:41 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:51.122 11:47:41 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:51.122 11:47:41 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.122 11:47:41 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.122 11:47:41 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' 
]] 00:06:51.381 11:47:41 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:06:51.381 11:47:41 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:06:51.381 Unsupported workload type: foobar 00:06:51.381 [2024-07-12 11:47:41.392142] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:51.381 accel_perf options: 00:06:51.381 [-h help message] 00:06:51.381 [-q queue depth per core] 00:06:51.381 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:51.381 [-T number of threads per core 00:06:51.381 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:51.381 [-t time in seconds] 00:06:51.381 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:51.381 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:06:51.381 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:51.381 [-l for compress/decompress workloads, name of uncompressed input file 00:06:51.381 [-S for crc32c workload, use this seed value (default 0) 00:06:51.381 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:51.381 [-f for fill workload, use this BYTE value (default 255) 00:06:51.381 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:51.381 [-y verify result if this switch is on] 00:06:51.381 [-a tasks to allocate per core (default: same value as -q)] 00:06:51.381 Can be used to spread operations across a wider range of memory. 
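The "Unsupported workload type: foobar" failure above comes from accel_perf rejecting a `-w` value outside its supported list. A small illustrative check in the same spirit; the workload names are taken from the usage text above, but `check_workload` itself is a hypothetical stand-in, not accel_perf's real parser:

```shell
# Reject -w values outside the supported workload list enumerated
# in the usage text above (hypothetical helper).
workloads="copy fill crc32c copy_crc32c compare compress decompress dualcast xor"

check_workload() {
    local w
    for w in $workloads; do
        [ "$1" = "$w" ] && return 0
    done
    echo "Unsupported workload type: $1" >&2
    return 1
}
```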
00:06:51.381 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:06:51.381 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:51.381 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:51.381 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:51.381 00:06:51.381 real 0m0.038s 00:06:51.381 user 0m0.022s 00:06:51.381 sys 0m0.016s 00:06:51.381 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:51.381 11:47:41 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:06:51.381 ************************************ 00:06:51.381 END TEST accel_wrong_workload 00:06:51.381 ************************************ 00:06:51.381 Error: writing output failed: Broken pipe 00:06:51.381 11:47:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:51.381 11:47:41 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:51.381 11:47:41 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:06:51.381 11:47:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.381 11:47:41 accel -- common/autotest_common.sh@10 -- # set +x 00:06:51.381 ************************************ 00:06:51.381 START TEST accel_negative_buffers 00:06:51.381 ************************************ 00:06:51.381 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:51.381 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:06:51.381 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:51.381 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:51.381 11:47:41 accel.accel_negative_buffers -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:51.381 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:51.381 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:51.381 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:06:51.381 11:47:41 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:51.381 11:47:41 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:06:51.381 11:47:41 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:51.381 11:47:41 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:51.381 11:47:41 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.381 11:47:41 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.381 11:47:41 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:51.381 11:47:41 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:06:51.381 11:47:41 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:06:51.382 -x option must be non-negative. 00:06:51.382 [2024-07-12 11:47:41.497333] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:51.382 accel_perf options: 00:06:51.382 [-h help message] 00:06:51.382 [-q queue depth per core] 00:06:51.382 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:51.382 [-T number of threads per core 00:06:51.382 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:06:51.382 [-t time in seconds] 00:06:51.382 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:51.382 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:06:51.382 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:51.382 [-l for compress/decompress workloads, name of uncompressed input file 00:06:51.382 [-S for crc32c workload, use this seed value (default 0) 00:06:51.382 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:51.382 [-f for fill workload, use this BYTE value (default 255) 00:06:51.382 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:51.382 [-y verify result if this switch is on] 00:06:51.382 [-a tasks to allocate per core (default: same value as -q)] 00:06:51.382 Can be used to spread operations across a wider range of memory. 
00:06:51.382 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:06:51.382 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:51.382 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:51.382 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:51.382 00:06:51.382 real 0m0.040s 00:06:51.382 user 0m0.026s 00:06:51.382 sys 0m0.014s 00:06:51.382 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:51.382 11:47:41 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:06:51.382 ************************************ 00:06:51.382 END TEST accel_negative_buffers 00:06:51.382 ************************************ 00:06:51.382 Error: writing output failed: Broken pipe 00:06:51.382 11:47:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:51.382 11:47:41 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:51.382 11:47:41 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:51.382 11:47:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.382 11:47:41 accel -- common/autotest_common.sh@10 -- # set +x 00:06:51.382 ************************************ 00:06:51.382 START TEST accel_crc32c 00:06:51.382 ************************************ 00:06:51.382 11:47:41 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 
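The `IFS=:` / `read -r var val` pairs above drive accel_test's configuration loop: each "name:value" line is split on ':' and dispatched through a case on the variable name. A self-contained sketch of that pattern with hypothetical config lines:

```shell
# Parse "name:value" config lines the way the accel.sh@19 loop above
# does: split each line on ':' and dispatch on the name.
accel_opc=""
accel_module=""

while IFS=: read -r var val; do
    case "$var" in
        opc) accel_opc=$val ;;
        module) accel_module=$val ;;
    esac
done <<< $'opc:crc32c\nmodule:software'

echo "$accel_opc on $accel_module"   # → crc32c on software
```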
00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:51.382 11:47:41 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:51.382 [2024-07-12 11:47:41.598341] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:51.382 [2024-07-12 11:47:41.598399] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid560940 ] 00:06:51.642 [2024-07-12 11:47:41.663494] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.642 [2024-07-12 11:47:41.733282] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 
accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.642 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # 
read -r var val 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:51.643 11:47:41 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:42 accel.accel_crc32c -- 
accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:53.019 11:47:42 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.019 00:06:53.019 real 0m1.360s 00:06:53.019 user 0m1.233s 00:06:53.019 sys 0m0.130s 00:06:53.019 11:47:42 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:53.019 11:47:42 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:53.019 ************************************ 00:06:53.019 END TEST accel_crc32c 00:06:53.019 ************************************ 00:06:53.019 11:47:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:53.019 11:47:42 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:53.019 11:47:42 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:53.019 11:47:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.019 11:47:42 accel -- common/autotest_common.sh@10 -- # set +x 00:06:53.019 ************************************ 00:06:53.019 START TEST 
accel_crc32c_C2 00:06:53.019 ************************************ 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:53.019 11:47:42 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:53.019 [2024-07-12 11:47:43.019665] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:06:53.019 [2024-07-12 11:47:43.019711] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid561184 ] 00:06:53.019 [2024-07-12 11:47:43.084562] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.019 [2024-07-12 11:47:43.154614] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- 
accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- 
# val=32 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:53.019 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:53.020 11:47:43 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:53.020 11:47:43 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:54.393 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 
00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:54.394 00:06:54.394 real 0m1.354s 00:06:54.394 user 0m1.243s 00:06:54.394 sys 0m0.120s 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:54.394 11:47:44 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:54.394 ************************************ 00:06:54.394 END TEST accel_crc32c_C2 00:06:54.394 ************************************ 00:06:54.394 11:47:44 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:54.394 11:47:44 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:54.394 11:47:44 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:54.394 11:47:44 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:54.394 11:47:44 accel -- common/autotest_common.sh@10 -- # set +x 00:06:54.394 ************************************ 00:06:54.394 START TEST accel_copy 00:06:54.394 ************************************ 00:06:54.394 11:47:44 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:06:54.394 [2024-07-12 11:47:44.433355] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:54.394 [2024-07-12 11:47:44.433412] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid561426 ] 00:06:54.394 [2024-07-12 11:47:44.497262] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.394 [2024-07-12 11:47:44.568429] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- 
accel/accel.sh@20 -- # val=0x1 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:06:54.394 11:47:44 accel.accel_copy -- 
accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- 
accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:54.394 11:47:44 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@20 -- # val= 
00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:06:55.771 11:47:45 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:55.771 00:06:55.771 real 0m1.357s 00:06:55.771 user 0m1.235s 00:06:55.771 sys 0m0.125s 00:06:55.771 11:47:45 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:55.771 11:47:45 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:06:55.771 ************************************ 00:06:55.771 END TEST accel_copy 00:06:55.771 ************************************ 00:06:55.771 11:47:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:55.771 11:47:45 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:55.771 11:47:45 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:06:55.771 11:47:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:55.771 11:47:45 accel -- common/autotest_common.sh@10 -- # set +x 00:06:55.771 ************************************ 00:06:55.771 START TEST accel_fill 00:06:55.771 ************************************ 00:06:55.771 11:47:45 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:55.771 11:47:45 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:06:55.771 11:47:45 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:06:55.771 11:47:45 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:55.771 11:47:45 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:55.771 11:47:45 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 
128 -q 64 -a 64 -y 00:06:55.772 11:47:45 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:55.772 11:47:45 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:06:55.772 11:47:45 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:55.772 11:47:45 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:55.772 11:47:45 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.772 11:47:45 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.772 11:47:45 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:55.772 11:47:45 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:06:55.772 11:47:45 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:06:55.772 [2024-07-12 11:47:45.858582] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:06:55.772 [2024-07-12 11:47:45.858633] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid561679 ] 00:06:55.772 [2024-07-12 11:47:45.923305] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.772 [2024-07-12 11:47:45.992386] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.030 11:47:46 
accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.030 11:47:46 
accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.030 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:06:56.031 11:47:46 
accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.031 11:47:46 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.967 11:47:47 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:06:56.967 11:47:47 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.967 00:06:56.967 real 0m1.355s 00:06:56.967 user 0m1.237s 00:06:56.967 sys 0m0.125s 00:06:56.967 11:47:47 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:56.967 11:47:47 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:06:56.967 ************************************ 00:06:56.967 END TEST accel_fill 00:06:56.967 ************************************ 00:06:57.226 11:47:47 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:57.226 11:47:47 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:57.226 11:47:47 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:06:57.226 11:47:47 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:57.227 11:47:47 accel -- common/autotest_common.sh@10 -- # set +x 00:06:57.227 ************************************ 00:06:57.227 START TEST accel_copy_crc32c 00:06:57.227 ************************************ 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 
00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:57.227 [2024-07-12 11:47:47.276240] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:06:57.227 [2024-07-12 11:47:47.276284] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid561921 ] 00:06:57.227 [2024-07-12 11:47:47.342277] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.227 [2024-07-12 11:47:47.412491] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.227 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # 
val=software 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:57.486 11:47:47 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 
-- # read -r var val 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.422 00:06:58.422 real 0m1.356s 00:06:58.422 user 0m1.240s 00:06:58.422 sys 0m0.122s 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.422 11:47:48 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:58.422 ************************************ 00:06:58.422 END TEST accel_copy_crc32c 00:06:58.422 ************************************ 00:06:58.422 11:47:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:06:58.422 11:47:48 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:58.422 11:47:48 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:06:58.422 11:47:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.422 11:47:48 accel -- common/autotest_common.sh@10 -- # set +x 00:06:58.422 ************************************ 00:06:58.422 START TEST accel_copy_crc32c_C2 00:06:58.422 ************************************ 00:06:58.422 11:47:48 
accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:58.422 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:58.681 [2024-07-12 11:47:48.675406] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:06:58.681 [2024-07-12 11:47:48.675439] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid562171 ] 00:06:58.681 [2024-07-12 11:47:48.738930] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.681 [2024-07-12 11:47:48.809230] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- 
accel/accel.sh@21 -- # case "$var" in 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:58.681 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:58.682 
11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.682 11:47:48 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:00.059 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:00.059 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.059 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:00.059 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:00.059 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:00.059 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.059 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:00.059 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.060 00:07:00.060 real 0m1.348s 00:07:00.060 user 0m1.230s 00:07:00.060 sys 0m0.115s 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:00.060 11:47:50 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:00.060 ************************************ 00:07:00.060 END TEST accel_copy_crc32c_C2 00:07:00.060 ************************************ 00:07:00.060 11:47:50 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:00.060 11:47:50 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:00.060 11:47:50 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 
']' 00:07:00.060 11:47:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.060 11:47:50 accel -- common/autotest_common.sh@10 -- # set +x 00:07:00.060 ************************************ 00:07:00.060 START TEST accel_dualcast 00:07:00.060 ************************************ 00:07:00.060 11:47:50 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:00.060 [2024-07-12 11:47:50.108286] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:00.060 [2024-07-12 11:47:50.108340] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid562416 ] 00:07:00.060 [2024-07-12 11:47:50.173734] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.060 [2024-07-12 11:47:50.250541] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 
00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.060 11:47:50 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:00.319 11:47:50 
accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.319 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:00.320 11:47:50 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:01.255 
11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:01.255 11:47:51 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == 
\s\o\f\t\w\a\r\e ]] 00:07:01.255 00:07:01.255 real 0m1.361s 00:07:01.255 user 0m1.246s 00:07:01.255 sys 0m0.122s 00:07:01.255 11:47:51 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:01.255 11:47:51 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:01.255 ************************************ 00:07:01.255 END TEST accel_dualcast 00:07:01.255 ************************************ 00:07:01.255 11:47:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:01.255 11:47:51 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:01.255 11:47:51 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:01.255 11:47:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.255 11:47:51 accel -- common/autotest_common.sh@10 -- # set +x 00:07:01.514 ************************************ 00:07:01.514 START TEST accel_compare 00:07:01.514 ************************************ 00:07:01.514 11:47:51 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:01.514 11:47:51 
accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:01.514 [2024-07-12 11:47:51.533532] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:01.514 [2024-07-12 11:47:51.533586] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid562662 ] 00:07:01.514 [2024-07-12 11:47:51.597872] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.514 [2024-07-12 11:47:51.672950] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # 
val= 00:07:01.514 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 
-- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@20 -- # val= 
00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:01.515 11:47:51 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:02.891 11:47:52 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:02.892 11:47:52 
accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:02.892 11:47:52 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:02.892 11:47:52 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:02.892 11:47:52 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:02.892 11:47:52 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.892 00:07:02.892 real 0m1.366s 00:07:02.892 user 0m1.241s 00:07:02.892 sys 0m0.129s 00:07:02.892 11:47:52 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:02.892 11:47:52 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:02.892 ************************************ 00:07:02.892 END TEST accel_compare 00:07:02.892 ************************************ 00:07:02.892 11:47:52 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:02.892 11:47:52 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:02.892 11:47:52 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:02.892 11:47:52 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.892 11:47:52 accel -- common/autotest_common.sh@10 -- # set +x 00:07:02.892 ************************************ 00:07:02.892 START TEST accel_xor 00:07:02.892 ************************************ 00:07:02.892 11:47:52 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:02.892 11:47:52 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:02.892 [2024-07-12 11:47:52.964524] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:02.892 [2024-07-12 11:47:52.964572] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid562914 ] 00:07:02.892 [2024-07-12 11:47:53.029815] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.892 [2024-07-12 11:47:53.099110] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 
00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.149 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor 
-- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # 
read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:03.150 11:47:53 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.084 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.084 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.084 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.084 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.084 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.084 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.084 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.084 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.084 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.085 11:47:54 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:04.085 11:47:54 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.085 00:07:04.085 real 0m1.359s 00:07:04.085 user 0m1.239s 00:07:04.085 sys 0m0.126s 00:07:04.085 11:47:54 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:04.085 11:47:54 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:04.085 ************************************ 00:07:04.085 END TEST accel_xor 00:07:04.085 ************************************ 00:07:04.085 11:47:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:04.085 11:47:54 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:04.085 11:47:54 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:04.085 11:47:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.085 11:47:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:04.344 ************************************ 00:07:04.344 START TEST accel_xor 00:07:04.344 ************************************ 00:07:04.344 11:47:54 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.344 11:47:54 accel.accel_xor -- 
accel/accel.sh@19 -- # read -r var val 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:04.344 [2024-07-12 11:47:54.374718] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:04.344 [2024-07-12 11:47:54.374761] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid563159 ] 00:07:04.344 [2024-07-12 11:47:54.438275] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.344 [2024-07-12 11:47:54.508414] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.344 
11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.344 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 
accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:04.345 11:47:54 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # 
IFS=: 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:05.722 11:47:55 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:05.722 00:07:05.722 real 0m1.360s 00:07:05.722 user 0m1.247s 00:07:05.722 sys 0m0.114s 00:07:05.722 11:47:55 accel.accel_xor -- common/autotest_common.sh@1124 
-- # xtrace_disable 00:07:05.722 11:47:55 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:05.722 ************************************ 00:07:05.722 END TEST accel_xor 00:07:05.722 ************************************ 00:07:05.722 11:47:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:05.722 11:47:55 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:05.722 11:47:55 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:05.722 11:47:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:05.722 11:47:55 accel -- common/autotest_common.sh@10 -- # set +x 00:07:05.722 ************************************ 00:07:05.722 START TEST accel_dif_verify 00:07:05.722 ************************************ 00:07:05.722 11:47:55 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.722 11:47:55 
accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:05.722 11:47:55 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:07:05.722 [2024-07-12 11:47:55.782545] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:05.722 [2024-07-12 11:47:55.782581] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid563408 ] 00:07:05.722 [2024-07-12 11:47:55.846110] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.722 [2024-07-12 11:47:55.916356] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 
accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 
00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 
11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:05.981 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.982 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.982 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:05.982 11:47:55 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:05.982 11:47:55 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:05.982 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:05.982 11:47:55 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:06.918 11:47:57 
accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:06.918 11:47:57 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:06.918 00:07:06.918 real 0m1.345s 00:07:06.918 user 0m1.231s 00:07:06.918 sys 0m0.119s 00:07:06.918 11:47:57 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:06.918 11:47:57 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:07:06.918 ************************************ 00:07:06.918 END TEST accel_dif_verify 00:07:06.918 ************************************ 
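The trace above shows accel/accel.sh looping over accel_perf's dry-run output with `IFS=: read -r var val`, then latching `accel_opc` and `accel_module` through a `case "$var"` dispatch before the post-run `[[ -n … ]]` checks. A minimal standalone sketch of that parsing pattern follows; the `opc:`/`module:` line labels in the heredoc are hypothetical stand-ins for illustration, not the real accel_perf output format:

```shell
#!/usr/bin/env bash
# Sketch of the IFS=: var/val loop visible in the accel.sh trace above.
# NOTE: the "opc:" / "module:" input lines are assumed, not real output.
accel_opc=""
accel_module=""
while IFS=: read -r var val; do
    case "$var" in
        opc)    accel_opc=$val ;;      # e.g. dif_verify
        module) accel_module=$val ;;   # e.g. software
    esac
done <<'EOF'
opc:dif_verify
module:software
EOF

# Mirrors the post-loop sanity checks seen in the trace:
[[ -n $accel_module ]] && [[ -n $accel_opc ]] &&
    echo "tested $accel_opc via $accel_module"
```

Splitting on `:` with `read -r var val` keeps everything after the first colon in `val`, which is why the trace's `val='4096 bytes'` entries survive as single values.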
00:07:06.918 11:47:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:06.918 11:47:57 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:06.918 11:47:57 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:06.918 11:47:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:06.918 11:47:57 accel -- common/autotest_common.sh@10 -- # set +x 00:07:07.178 ************************************ 00:07:07.178 START TEST accel_dif_generate 00:07:07.178 ************************************ 00:07:07.178 11:47:57 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:07:07.178 11:47:57 accel.accel_dif_generate -- 
accel/accel.sh@41 -- # jq -r . 00:07:07.178 [2024-07-12 11:47:57.194990] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:07.178 [2024-07-12 11:47:57.195034] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid563651 ] 00:07:07.178 [2024-07-12 11:47:57.258379] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.178 [2024-07-12 11:47:57.330364] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.178 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 
accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # 
val='1 seconds' 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:07.179 11:47:57 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:08.552 11:47:58 accel.accel_dif_generate -- 
accel/accel.sh@20 -- # val= 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:08.552 11:47:58 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.552 00:07:08.552 real 0m1.358s 00:07:08.552 user 0m1.250s 00:07:08.552 sys 0m0.112s 00:07:08.552 11:47:58 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:08.552 11:47:58 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:07:08.552 ************************************ 00:07:08.552 END TEST accel_dif_generate 00:07:08.552 
************************************ 00:07:08.552 11:47:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:08.552 11:47:58 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:08.552 11:47:58 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:08.552 11:47:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.552 11:47:58 accel -- common/autotest_common.sh@10 -- # set +x 00:07:08.552 ************************************ 00:07:08.552 START TEST accel_dif_generate_copy 00:07:08.552 ************************************ 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:08.552 
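Each section above ends with a check like `[[ software == \s\o\f\t\w\a\r\e ]]`. In bash, the right-hand side of `==` inside `[[ ]]` is a glob pattern, so backslash-escaping every character forces an exact literal comparison even if the module name ever contained glob metacharacters. A small sketch of the difference, relying only on bash's own `[[` semantics:

```shell
#!/usr/bin/env bash
# The RHS of [[ == ]] is a pattern; escaping each character makes it literal.
mod='software'

[[ $mod == s* ]]               && echo "glob match"     # pattern comparison
[[ $mod == \s\o\f\t\w\a\r\e ]] && echo "literal match"  # exact-string comparison

# Why escaping matters: an unescaped metacharacter matches broadly.
[[ 'anything' == * ]]  && echo "bare * matches everything"
[[ 'anything' == \* ]] || echo "escaped * matches only a literal asterisk"
```

Quoting the RHS (`[[ $mod == "software" ]]`) achieves the same literal match; the escaped form is simply what the generated trace renders.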
11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:07:08.552 [2024-07-12 11:47:58.612224] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:08.552 [2024-07-12 11:47:58.612269] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid563897 ] 00:07:08.552 [2024-07-12 11:47:58.676529] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.552 [2024-07-12 11:47:58.746684] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.552 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:08.813 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.813 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.813 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.813 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:07:08.813 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.813 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.813 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.813 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:08.813 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 
-- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 
00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:08.814 11:47:58 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:09.749 11:47:59 accel.accel_dif_generate_copy 
-- accel/accel.sh@19 -- # IFS=: 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:09.749 00:07:09.749 real 0m1.352s 00:07:09.749 user 0m1.233s 00:07:09.749 sys 0m0.129s 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:09.749 11:47:59 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:07:09.749 ************************************ 00:07:09.749 END TEST accel_dif_generate_copy 00:07:09.749 ************************************ 00:07:09.749 11:47:59 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:07:09.749 11:47:59 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:07:09.749 11:47:59 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:09.749 11:47:59 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:09.749 11:47:59 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.749 11:47:59 accel -- common/autotest_common.sh@10 -- # set +x 00:07:09.749 ************************************ 00:07:09.749 START TEST accel_comp 00:07:09.749 ************************************ 00:07:09.749 11:47:59 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.749 11:47:59 accel.accel_comp -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:09.749 11:47:59 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:07:10.007 [2024-07-12 11:48:00.018158] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:10.007 [2024-07-12 11:48:00.018205] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid564146 ] 00:07:10.007 [2024-07-12 11:48:00.085099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.007 [2024-07-12 11:48:00.157614] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:07:10.007 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var 
val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # 
read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:10.008 11:48:00 
accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.008 11:48:00 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.428 11:48:01 accel.accel_comp -- 
accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:11.428 11:48:01 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:11.428 00:07:11.428 real 0m1.369s 00:07:11.428 user 0m1.246s 00:07:11.428 sys 0m0.125s 00:07:11.428 11:48:01 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:11.428 11:48:01 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:07:11.428 ************************************ 00:07:11.428 END TEST accel_comp 00:07:11.428 ************************************ 00:07:11.428 11:48:01 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:11.428 11:48:01 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:11.428 11:48:01 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:11.428 11:48:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.428 11:48:01 accel -- common/autotest_common.sh@10 -- # set +x 00:07:11.428 ************************************ 00:07:11.428 START TEST accel_decomp 00:07:11.428 ************************************ 00:07:11.428 11:48:01 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:11.428 
11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:11.428 [2024-07-12 11:48:01.444216] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:11.428 [2024-07-12 11:48:01.444261] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid564475 ] 00:07:11.428 [2024-07-12 11:48:01.508603] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.428 [2024-07-12 11:48:01.580089] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.428 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- 
accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # 
case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 
accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:11.429 11:48:01 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:12.882 11:48:02 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:12.883 11:48:02 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.883 00:07:12.883 real 0m1.360s 00:07:12.883 user 0m0.014s 00:07:12.883 sys 0m0.001s 00:07:12.883 11:48:02 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:12.883 11:48:02 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:12.883 ************************************ 00:07:12.883 END TEST accel_decomp 00:07:12.883 ************************************ 00:07:12.883 11:48:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:12.883 11:48:02 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.883 11:48:02 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:12.883 11:48:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.883 11:48:02 accel -- common/autotest_common.sh@10 -- # set +x 00:07:12.883 ************************************ 00:07:12.883 START TEST accel_decomp_full 00:07:12.883 ************************************ 00:07:12.883 11:48:02 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@19 -- 
# IFS=: 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:12.883 11:48:02 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:12.883 [2024-07-12 11:48:02.847407] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:12.883 [2024-07-12 11:48:02.847451] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid564765 ] 00:07:12.883 [2024-07-12 11:48:02.911772] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.883 [2024-07-12 11:48:02.982753] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 
accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # 
read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:12.883 11:48:03 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:14.260 
11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:14.260 11:48:04 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.260 00:07:14.260 real 0m1.366s 00:07:14.260 user 0m0.012s 00:07:14.260 sys 0m0.002s 00:07:14.260 11:48:04 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.260 11:48:04 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:14.260 ************************************ 00:07:14.260 END TEST accel_decomp_full 00:07:14.260 ************************************ 00:07:14.260 11:48:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:14.260 11:48:04 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:14.260 11:48:04 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:14.260 11:48:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.260 11:48:04 accel -- common/autotest_common.sh@10 -- # set +x 00:07:14.260 ************************************ 
00:07:14.260 START TEST accel_decomp_mcore 00:07:14.260 ************************************ 00:07:14.260 11:48:04 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:14.260 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:14.261 [2024-07-12 11:48:04.260733] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:14.261 [2024-07-12 11:48:04.260768] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid565022 ] 00:07:14.261 [2024-07-12 11:48:04.322598] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:14.261 [2024-07-12 11:48:04.395998] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:14.261 [2024-07-12 11:48:04.396099] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:14.261 [2024-07-12 11:48:04.396162] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:14.261 [2024-07-12 11:48:04.396163] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 
-- # val=software 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:14.261 11:48:04 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 
11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:15.640 00:07:15.640 real 0m1.351s 00:07:15.640 user 0m0.013s 00:07:15.640 sys 0m0.001s 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:15.640 11:48:05 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:15.640 ************************************ 00:07:15.640 END TEST accel_decomp_mcore 00:07:15.640 ************************************ 00:07:15.640 11:48:05 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:15.640 11:48:05 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:15.640 11:48:05 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:15.640 11:48:05 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:15.640 11:48:05 accel -- common/autotest_common.sh@10 -- # set +x 00:07:15.640 ************************************ 00:07:15.640 START TEST accel_decomp_full_mcore 00:07:15.640 ************************************ 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:15.640 [2024-07-12 11:48:05.665373] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:15.640 [2024-07-12 11:48:05.665406] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid565267 ] 00:07:15.640 [2024-07-12 11:48:05.727087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:15.640 [2024-07-12 11:48:05.800130] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:15.640 [2024-07-12 11:48:05.800230] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:15.640 [2024-07-12 11:48:05.800324] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:15.640 [2024-07-12 11:48:05.800326] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:15.640 11:48:05 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.640 11:48:05 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.640 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:15.641 11:48:05 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.027 00:07:17.027 real 0m1.356s 00:07:17.027 user 0m0.012s 00:07:17.027 sys 0m0.002s 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:17.027 11:48:07 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:17.027 ************************************ 00:07:17.027 END TEST accel_decomp_full_mcore 00:07:17.027 ************************************ 00:07:17.027 11:48:07 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:17.027 11:48:07 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:17.027 11:48:07 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:17.027 11:48:07 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.027 11:48:07 accel -- common/autotest_common.sh@10 -- # set +x 00:07:17.027 
************************************ 00:07:17.027 START TEST accel_decomp_mthread 00:07:17.027 ************************************ 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:17.027 [2024-07-12 11:48:07.077733] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:17.027 [2024-07-12 11:48:07.077767] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid565521 ] 00:07:17.027 [2024-07-12 11:48:07.139766] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.027 [2024-07-12 11:48:07.210686] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.027 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:17.286 11:48:07 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- 
accel/accel.sh@20 -- # val=Yes 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:17.286 11:48:07 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:18.224 00:07:18.224 real 0m1.346s 00:07:18.224 user 0m0.011s 00:07:18.224 sys 0m0.003s 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:18.224 11:48:08 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:18.224 
************************************ 00:07:18.224 END TEST accel_decomp_mthread 00:07:18.224 ************************************ 00:07:18.224 11:48:08 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:18.224 11:48:08 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:18.224 11:48:08 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:18.224 11:48:08 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.224 11:48:08 accel -- common/autotest_common.sh@10 -- # set +x 00:07:18.224 ************************************ 00:07:18.224 START TEST accel_decomp_full_mthread 00:07:18.224 ************************************ 00:07:18.224 11:48:08 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:18.224 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:18.224 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:18.224 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.224 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.224 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:18.224 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:18.224 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:18.485 [2024-07-12 11:48:08.495799] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:18.485 [2024-07-12 11:48:08.495844] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid565823 ] 00:07:18.485 [2024-07-12 11:48:08.560193] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.485 [2024-07-12 11:48:08.632452] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 
-- # val= 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:18.485 11:48:08 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:18.485 11:48:08 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:18.485 11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:18.485 
11:48:08 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:19.869 11:48:09 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:19.869 00:07:19.869 real 0m1.395s 00:07:19.869 user 0m0.013s 00:07:19.869 sys 0m0.001s 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:19.869 11:48:09 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:19.869 ************************************ 00:07:19.869 END TEST accel_decomp_full_mthread 00:07:19.869 ************************************ 00:07:19.869 11:48:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:19.869 11:48:09 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:07:19.869 11:48:09 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:07:19.869 11:48:09 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:07:19.869 11:48:09 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:19.869 11:48:09 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=566349 00:07:19.869 11:48:09 accel -- accel/accel.sh@63 -- # waitforlisten 566349 00:07:19.869 11:48:09 accel -- common/autotest_common.sh@829 -- # '[' -z 566349 ']' 
00:07:19.869 11:48:09 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:19.869 11:48:09 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:19.869 11:48:09 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:19.869 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:19.869 11:48:09 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:19.869 11:48:09 accel -- common/autotest_common.sh@10 -- # set +x 00:07:19.869 11:48:09 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:19.869 11:48:09 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:19.869 11:48:09 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:19.869 11:48:09 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:19.869 11:48:09 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:19.869 11:48:09 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:19.869 11:48:09 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:19.869 11:48:09 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:19.869 11:48:09 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:19.869 11:48:09 accel -- accel/accel.sh@41 -- # jq -r . 00:07:19.869 [2024-07-12 11:48:09.943513] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:19.869 [2024-07-12 11:48:09.943565] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid566349 ] 00:07:19.869 [2024-07-12 11:48:10.010599] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.869 [2024-07-12 11:48:10.095536] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.444 [2024-07-12 11:48:10.470358] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@862 -- # return 0 00:07:20.702 11:48:10 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:20.702 11:48:10 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:20.702 11:48:10 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:20.702 11:48:10 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:07:20.702 11:48:10 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:07:20.702 11:48:10 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@10 -- # set +x 00:07:20.702 11:48:10 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:07:20.702 11:48:10 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.702 "method": "compressdev_scan_accel_module", 00:07:20.702 11:48:10 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:20.702 11:48:10 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:20.702 11:48:10 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@10 -- # set +x 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- 
accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # IFS== 00:07:20.702 11:48:10 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:20.702 11:48:10 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:20.702 11:48:10 accel -- accel/accel.sh@75 -- # killprocess 566349 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@948 -- # '[' -z 566349 ']' 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@952 -- # kill -0 566349 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@953 -- # uname 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:20.702 11:48:10 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 566349 00:07:20.960 11:48:10 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:20.960 11:48:10 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:20.960 11:48:10 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 566349' 00:07:20.960 killing process with pid 566349 00:07:20.960 11:48:10 accel -- common/autotest_common.sh@967 -- # kill 
566349 00:07:20.960 11:48:10 accel -- common/autotest_common.sh@972 -- # wait 566349 00:07:21.220 11:48:11 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:21.220 11:48:11 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:21.220 11:48:11 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:21.220 11:48:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.220 11:48:11 accel -- common/autotest_common.sh@10 -- # set +x 00:07:21.220 ************************************ 00:07:21.220 START TEST accel_cdev_comp 00:07:21.220 ************************************ 00:07:21.220 11:48:11 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.220 11:48:11 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:21.220 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:07:21.220 [2024-07-12 11:48:11.346192] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:21.220 [2024-07-12 11:48:11.346243] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid566642 ] 00:07:21.220 [2024-07-12 11:48:11.412370] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.478 [2024-07-12 11:48:11.485604] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.736 [2024-07-12 11:48:11.860181] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:21.736 [2024-07-12 11:48:11.861774] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2216a30 PMD being used: compress_qat 00:07:21.736 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:21.736 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.736 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.736 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.736 [2024-07-12 11:48:11.864983] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x221b830 PMD being used: compress_qat 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp 
-- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 
00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp 
-- accel/accel.sh@20 -- # val=1 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:21.737 11:48:11 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:23.110 11:48:13 
accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:23.110 11:48:13 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:23.110 00:07:23.110 real 0m1.694s 00:07:23.110 user 0m1.401s 00:07:23.110 sys 0m0.291s 00:07:23.110 11:48:13 accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- 
# xtrace_disable 00:07:23.110 11:48:13 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:07:23.110 ************************************ 00:07:23.110 END TEST accel_cdev_comp 00:07:23.110 ************************************ 00:07:23.110 11:48:13 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:23.110 11:48:13 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:23.110 11:48:13 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:23.110 11:48:13 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.110 11:48:13 accel -- common/autotest_common.sh@10 -- # set +x 00:07:23.110 ************************************ 00:07:23.110 START TEST accel_cdev_decomp 00:07:23.110 ************************************ 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # 
accel_json_cfg=() 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:23.110 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:23.110 [2024-07-12 11:48:13.100802] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:23.110 [2024-07-12 11:48:13.100853] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid567107 ] 00:07:23.110 [2024-07-12 11:48:13.164193] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.110 [2024-07-12 11:48:13.234190] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.368 [2024-07-12 11:48:13.597164] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:23.368 [2024-07-12 11:48:13.598789] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa34a30 PMD being used: compress_qat 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.368 [2024-07-12 11:48:13.602149] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa39830 PMD 
being used: compress_qat 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:23.368 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.369 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.369 11:48:13 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:23.628 11:48:13 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:24.566 11:48:14 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:24.566 11:48:14 
accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:24.566 00:07:24.566 real 0m1.675s 00:07:24.566 user 0m1.410s 00:07:24.566 sys 0m0.270s 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:24.566 11:48:14 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:24.566 ************************************ 00:07:24.566 END TEST accel_cdev_decomp 00:07:24.566 ************************************ 00:07:24.566 11:48:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:24.566 11:48:14 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:24.566 11:48:14 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:24.566 11:48:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:24.566 11:48:14 accel -- common/autotest_common.sh@10 -- # set +x 00:07:24.566 ************************************ 00:07:24.566 START TEST accel_cdev_decomp_full 00:07:24.566 ************************************ 00:07:24.566 11:48:14 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:24.566 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:24.566 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- 
accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:24.825 11:48:14 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:24.825 [2024-07-12 11:48:14.838333] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:24.825 [2024-07-12 11:48:14.838383] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid567350 ] 00:07:24.825 [2024-07-12 11:48:14.903810] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.825 [2024-07-12 11:48:14.974448] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.394 [2024-07-12 11:48:15.345522] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:25.394 [2024-07-12 11:48:15.347156] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xadda30 PMD being used: compress_qat 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 [2024-07-12 11:48:15.349713] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xaddad0 PMD being used: compress_qat 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- 
accel/accel.sh@20 -- # val=0x1 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:25.394 11:48:15 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:26.331 11:48:16 
accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:26.331 11:48:16 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:26.332 
00:07:26.332 real 0m1.684s 00:07:26.332 user 0m1.400s 00:07:26.332 sys 0m0.284s 00:07:26.332 11:48:16 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:26.332 11:48:16 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:26.332 ************************************ 00:07:26.332 END TEST accel_cdev_decomp_full 00:07:26.332 ************************************ 00:07:26.332 11:48:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:26.332 11:48:16 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:26.332 11:48:16 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:26.332 11:48:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:26.332 11:48:16 accel -- common/autotest_common.sh@10 -- # set +x 00:07:26.332 ************************************ 00:07:26.332 START TEST accel_cdev_decomp_mcore 00:07:26.332 ************************************ 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 
00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:26.332 11:48:16 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:26.332 [2024-07-12 11:48:16.567293] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
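The `run_test accel_cdev_decomp_mcore …` invocation and the `START TEST` / `END TEST` banners with `real/user/sys` timings traced above follow a wrapper pattern. A hedged sketch of that pattern (the real helper lives in SPDK's `common/autotest_common.sh` and also manages xtrace state, which this minimal version omits):

```shell
#!/usr/bin/env bash
# Minimal reconstruction of the run_test banner-plus-timing pattern seen in
# this log. Assumption: only the visible behavior (banners, timed command)
# is modeled; xtrace_disable / return-code bookkeeping is left out.
run_test() {
  local name=$1
  shift
  echo "************************************"
  echo "START TEST $name"
  echo "************************************"
  time "$@"      # `time` prints real/user/sys to stderr, as in the log
  echo "************************************"
  echo "END TEST $name"
  echo "************************************"
}

run_test demo_test true
```

The timing lines go to stderr while the banners go to stdout, which is why they interleave freely in the captured console output.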
00:07:26.332 [2024-07-12 11:48:16.567325] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid567603 ] 00:07:26.591 [2024-07-12 11:48:16.630810] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:26.591 [2024-07-12 11:48:16.704109] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:26.591 [2024-07-12 11:48:16.704203] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:26.591 [2024-07-12 11:48:16.704312] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:26.591 [2024-07-12 11:48:16.704314] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.850 [2024-07-12 11:48:17.078997] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:26.850 [2024-07-12 11:48:17.080647] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x126d000 PMD being used: compress_qat 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 [2024-07-12 11:48:17.085013] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa30419b8b0 PMD being used: compress_qat 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 
-- # val= 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 [2024-07-12 11:48:17.086064] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa2fc19b8b0 PMD being used: compress_qat 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:26.850 [2024-07-12 11:48:17.086571] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12722f0 PMD being used: compress_qat 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 [2024-07-12 11:48:17.086631] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa2f419b8b0 PMD being used: compress_qat 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.850 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.851 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:26.851 11:48:17 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.851 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.851 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.851 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:26.851 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:26.851 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:26.851 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:26.851 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:27.109 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.109 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.109 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.109 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:27.110 11:48:17 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.046 
11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:28.046 00:07:28.046 real 0m1.692s 00:07:28.046 user 0m5.771s 00:07:28.046 
sys 0m0.301s 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:28.046 11:48:18 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:28.046 ************************************ 00:07:28.046 END TEST accel_cdev_decomp_mcore 00:07:28.046 ************************************ 00:07:28.046 11:48:18 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:28.046 11:48:18 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:28.046 11:48:18 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:28.046 11:48:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.046 11:48:18 accel -- common/autotest_common.sh@10 -- # set +x 00:07:28.305 ************************************ 00:07:28.305 START TEST accel_cdev_decomp_full_mcore 00:07:28.305 ************************************ 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:28.305 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:28.305 [2024-07-12 11:48:18.338742] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
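The `-m 0xf` core mask passed to `accel_perf` above selects four cores, matching the `Total cores available: 4` notice and the four reactor threads that start. A small illustrative helper (hypothetical, not part of accel.sh) that counts the cores a mask selects by counting its set bits:

```shell
#!/usr/bin/env bash
# Hypothetical helper: count how many reactor cores a core mask like the
# "-m 0xf" argument selects. Each set bit in the mask is one core.
count_cores() {
  local mask=$(( $1 )) count=0   # arithmetic expansion parses 0x-prefixed hex
  while [ "$mask" -ne 0 ]; do
    count=$(( count + (mask & 1) ))
    mask=$(( mask >> 1 ))
  done
  echo "$count"
}

count_cores 0xf   # prints 4, matching "Total cores available: 4"
```

The single-core runs later in this log use `-c 0x1`, for which the same computation yields 1.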
00:07:28.305 [2024-07-12 11:48:18.338789] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid567966 ] 00:07:28.305 [2024-07-12 11:48:18.408884] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:28.305 [2024-07-12 11:48:18.485839] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:28.305 [2024-07-12 11:48:18.485935] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:28.305 [2024-07-12 11:48:18.486025] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:28.305 [2024-07-12 11:48:18.486026] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.873 [2024-07-12 11:48:18.861925] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:28.873 [2024-07-12 11:48:18.863568] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x838000 PMD being used: compress_qat 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 [2024-07-12 11:48:18.867110] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f708c19b8b0 PMD being used: compress_qat 00:07:28.874 11:48:18 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 [2024-07-12 11:48:18.868118] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f708419b8b0 PMD being used: compress_qat 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:28.874 [2024-07-12 11:48:18.868647] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x83b330 PMD being used: compress_qat 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 [2024-07-12 11:48:18.868709] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f707c19b8b0 PMD being used: compress_qat 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # 
val=decompress 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 
00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:28.874 11:48:18 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:29.812 11:48:20 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:29.812 11:48:20 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:29.812 00:07:29.812 real 0m1.717s 00:07:29.812 user 0m5.807s 00:07:29.812 sys 0m0.291s 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:29.812 11:48:20 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:29.812 ************************************ 00:07:29.812 END TEST accel_cdev_decomp_full_mcore 00:07:29.812 ************************************ 00:07:30.072 11:48:20 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:30.072 11:48:20 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:30.072 11:48:20 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:30.072 11:48:20 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:30.072 11:48:20 accel -- common/autotest_common.sh@10 -- # set +x 00:07:30.072 ************************************ 00:07:30.072 START TEST accel_cdev_decomp_mthread 00:07:30.072 ************************************ 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@17 -- # local accel_module 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:30.072 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:30.072 [2024-07-12 11:48:20.121459] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
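The `build_accel_config` trace above shows the mechanism for assembling the JSON fed to `accel_perf` via `/dev/fd/62`: fragments are appended to the `accel_json_cfg` array and a `local IFS=,` makes `"${accel_json_cfg[*]}"` expand comma-separated. A standalone sketch of just that joining step (the bracket wrapper around the fragments is an assumption; the full config accel.sh emits has more surrounding structure):

```shell
#!/usr/bin/env bash
# Sketch of the config-assembly step traced in build_accel_config:
# collect JSON method fragments, then join them with IFS=, on expansion.
accel_json_cfg=()
accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')

join_cfg() {
  local IFS=,                    # "${arr[*]}" joins on the first char of IFS
  echo "[${accel_json_cfg[*]}]"
}

join_cfg
```

With a single fragment the join is trivial; with several enabled modules the commas produced by `IFS=,` are what keep the result valid JSON for the `jq -r .` pass seen in the trace.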
00:07:30.072 [2024-07-12 11:48:20.121503] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid568317 ] 00:07:30.072 [2024-07-12 11:48:20.186540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.072 [2024-07-12 11:48:20.258915] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.640 [2024-07-12 11:48:20.629862] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:30.640 [2024-07-12 11:48:20.631463] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e0ea30 PMD being used: compress_qat 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:30.640 [2024-07-12 11:48:20.635343] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e13c90 PMD being used: compress_qat 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 
11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:30.640 [2024-07-12 11:48:20.636955] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f36a80 PMD being used: compress_qat 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 
11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:30.640 11:48:20 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.577 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 
00:07:31.577 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.577 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.577 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.577 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:31.577 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.577 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.577 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.577 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:31.578 00:07:31.578 real 0m1.689s 00:07:31.578 user 0m1.402s 00:07:31.578 sys 0m0.290s 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:31.578 11:48:21 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:31.578 ************************************ 00:07:31.578 END TEST accel_cdev_decomp_mthread 00:07:31.578 ************************************ 00:07:31.578 11:48:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:31.578 11:48:21 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:31.578 11:48:21 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:31.578 11:48:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.578 11:48:21 accel -- common/autotest_common.sh@10 -- # set +x 00:07:31.838 ************************************ 00:07:31.838 START TEST accel_cdev_decomp_full_mthread 00:07:31.838 ************************************ 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:31.838 11:48:21 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:07:31.838 [2024-07-12 11:48:21.859454] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:31.838 [2024-07-12 11:48:21.859489] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid568570 ] 00:07:31.838 [2024-07-12 11:48:21.921937] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.838 [2024-07-12 11:48:21.992496] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.405 [2024-07-12 11:48:22.355516] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:32.405 [2024-07-12 11:48:22.357193] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2786a30 PMD being used: compress_qat 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:32.405 [2024-07-12 11:48:22.360361] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2786ad0 PMD being used: compress_qat 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:32.405 [2024-07-12 11:48:22.362111] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x28ae670 PMD being used: compress_qat 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.405 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:32.406 11:48:22 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:33.339 11:48:23 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:33.339 00:07:33.339 real 0m1.666s 00:07:33.339 user 0m1.389s 00:07:33.339 sys 0m0.281s 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:33.339 11:48:23 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:33.339 ************************************ 00:07:33.339 END TEST accel_cdev_decomp_full_mthread 00:07:33.339 ************************************ 00:07:33.339 11:48:23 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:07:33.339 11:48:23 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:07:33.339 11:48:23 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:33.339 11:48:23 accel -- accel/accel.sh@137 -- # build_accel_config 00:07:33.339 11:48:23 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:33.339 11:48:23 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:33.339 11:48:23 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.339 11:48:23 accel -- common/autotest_common.sh@10 -- # set +x 00:07:33.339 11:48:23 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:33.339 11:48:23 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.339 11:48:23 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.339 11:48:23 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:33.339 11:48:23 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:33.339 11:48:23 accel -- accel/accel.sh@41 -- # jq -r . 00:07:33.339 ************************************ 00:07:33.339 START TEST accel_dif_functional_tests 00:07:33.339 ************************************ 00:07:33.339 11:48:23 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:33.598 [2024-07-12 11:48:23.613800] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:33.598 [2024-07-12 11:48:23.613835] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid568825 ] 00:07:33.598 [2024-07-12 11:48:23.677525] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:33.598 [2024-07-12 11:48:23.751147] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:33.598 [2024-07-12 11:48:23.751245] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:33.598 [2024-07-12 11:48:23.751247] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.598 00:07:33.598 00:07:33.598 CUnit - A unit testing framework for C - Version 2.1-3 00:07:33.598 http://cunit.sourceforge.net/ 00:07:33.598 00:07:33.598 00:07:33.598 Suite: accel_dif 00:07:33.598 Test: verify: DIF generated, GUARD check ...passed 00:07:33.598 Test: verify: DIF generated, APPTAG check ...passed 00:07:33.598 Test: verify: DIF generated, REFTAG check ...passed 00:07:33.598 Test: verify: DIF not generated, GUARD check ...[2024-07-12 11:48:23.833850] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:33.598 passed 00:07:33.598 Test: verify: DIF not generated, APPTAG check ...[2024-07-12 11:48:23.833898] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:33.598 passed 00:07:33.598 Test: verify: DIF not generated, REFTAG check ...[2024-07-12 11:48:23.833933] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:33.598 passed 00:07:33.598 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:33.598 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-12 11:48:23.833976] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:33.598 passed 00:07:33.598 Test: verify: APPTAG 
incorrect, no APPTAG check ...passed 00:07:33.598 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:33.598 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:33.598 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-12 11:48:23.834076] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:33.598 passed 00:07:33.598 Test: verify copy: DIF generated, GUARD check ...passed 00:07:33.598 Test: verify copy: DIF generated, APPTAG check ...passed 00:07:33.598 Test: verify copy: DIF generated, REFTAG check ...passed 00:07:33.598 Test: verify copy: DIF not generated, GUARD check ...[2024-07-12 11:48:23.834185] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:33.598 passed 00:07:33.598 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-12 11:48:23.834207] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:33.598 passed 00:07:33.598 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-12 11:48:23.834225] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:33.598 passed 00:07:33.598 Test: generate copy: DIF generated, GUARD check ...passed 00:07:33.598 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:33.598 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:33.598 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:33.598 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:33.598 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:33.598 Test: generate copy: iovecs-len validate ...[2024-07-12 11:48:23.834384] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:33.598 passed 00:07:33.598 Test: generate copy: buffer alignment validate ...passed 00:07:33.598 00:07:33.598 Run Summary: Type Total Ran Passed Failed Inactive 00:07:33.598 suites 1 1 n/a 0 0 00:07:33.598 tests 26 26 26 0 0 00:07:33.598 asserts 115 115 115 0 n/a 00:07:33.598 00:07:33.598 Elapsed time = 0.000 seconds 00:07:33.858 00:07:33.858 real 0m0.438s 00:07:33.858 user 0m0.653s 00:07:33.858 sys 0m0.154s 00:07:33.858 11:48:24 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:33.858 11:48:24 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:07:33.858 ************************************ 00:07:33.858 END TEST accel_dif_functional_tests 00:07:33.858 ************************************ 00:07:33.858 11:48:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:33.858 00:07:33.858 real 0m44.937s 00:07:33.858 user 0m54.948s 00:07:33.858 sys 0m7.081s 00:07:33.858 11:48:24 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:33.858 11:48:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:33.858 ************************************ 00:07:33.858 END TEST accel 00:07:33.858 ************************************ 00:07:33.858 11:48:24 -- common/autotest_common.sh@1142 -- # return 0 00:07:33.858 11:48:24 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:33.858 11:48:24 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:33.858 11:48:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.858 11:48:24 -- common/autotest_common.sh@10 -- # set +x 00:07:33.858 ************************************ 00:07:33.858 START TEST accel_rpc 00:07:33.858 ************************************ 00:07:34.117 11:48:24 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:34.117 * Looking for test storage... 
00:07:34.117 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:34.117 11:48:24 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:34.117 11:48:24 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=569097 00:07:34.117 11:48:24 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 569097 00:07:34.117 11:48:24 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 569097 ']' 00:07:34.117 11:48:24 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:34.117 11:48:24 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:34.117 11:48:24 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:34.117 11:48:24 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:34.117 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:34.117 11:48:24 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:34.117 11:48:24 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:34.117 [2024-07-12 11:48:24.231771] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:34.117 [2024-07-12 11:48:24.231810] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid569097 ] 00:07:34.117 [2024-07-12 11:48:24.296185] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.376 [2024-07-12 11:48:24.375495] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.944 11:48:25 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:34.944 11:48:25 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:34.944 11:48:25 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:34.944 11:48:25 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:34.944 11:48:25 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:34.944 11:48:25 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:34.944 11:48:25 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:34.944 11:48:25 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:34.944 11:48:25 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.944 11:48:25 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:34.944 ************************************ 00:07:34.944 START TEST accel_assign_opcode 00:07:34.944 ************************************ 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:34.944 [2024-07-12 11:48:25.037568] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation copy will be assigned to module incorrect 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:34.944 [2024-07-12 11:48:25.045583] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:34.944 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:35.203 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:35.203 11:48:25 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:35.203 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:35.203 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:35.203 11:48:25 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:35.203 11:48:25 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:07:35.203 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:35.203 software 00:07:35.203 00:07:35.203 real 0m0.243s 00:07:35.203 user 0m0.041s 00:07:35.203 sys 0m0.007s 00:07:35.203 11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.203 
11:48:25 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:35.203 ************************************ 00:07:35.203 END TEST accel_assign_opcode 00:07:35.203 ************************************ 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:35.203 11:48:25 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 569097 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 569097 ']' 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 569097 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 569097 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 569097' 00:07:35.203 killing process with pid 569097 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@967 -- # kill 569097 00:07:35.203 11:48:25 accel_rpc -- common/autotest_common.sh@972 -- # wait 569097 00:07:35.461 00:07:35.461 real 0m1.537s 00:07:35.461 user 0m1.556s 00:07:35.461 sys 0m0.408s 00:07:35.461 11:48:25 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.461 11:48:25 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:35.461 ************************************ 00:07:35.461 END TEST accel_rpc 00:07:35.461 ************************************ 00:07:35.461 11:48:25 -- common/autotest_common.sh@1142 -- # return 0 00:07:35.461 11:48:25 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:35.461 11:48:25 -- common/autotest_common.sh@1099 -- # '[' 2 
-le 1 ']' 00:07:35.461 11:48:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.461 11:48:25 -- common/autotest_common.sh@10 -- # set +x 00:07:35.720 ************************************ 00:07:35.720 START TEST app_cmdline 00:07:35.720 ************************************ 00:07:35.720 11:48:25 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:35.720 * Looking for test storage... 00:07:35.720 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:35.720 11:48:25 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:35.720 11:48:25 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=569405 00:07:35.720 11:48:25 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 569405 00:07:35.720 11:48:25 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:35.720 11:48:25 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 569405 ']' 00:07:35.720 11:48:25 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:35.720 11:48:25 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:35.720 11:48:25 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:35.720 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:35.720 11:48:25 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:35.720 11:48:25 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:35.720 [2024-07-12 11:48:25.849735] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:35.720 [2024-07-12 11:48:25.849777] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid569405 ] 00:07:35.720 [2024-07-12 11:48:25.914608] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.979 [2024-07-12 11:48:25.991522] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.545 11:48:26 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:36.545 11:48:26 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:07:36.545 11:48:26 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:36.804 { 00:07:36.804 "version": "SPDK v24.09-pre git sha1 b2ac96cc2", 00:07:36.804 "fields": { 00:07:36.804 "major": 24, 00:07:36.804 "minor": 9, 00:07:36.804 "patch": 0, 00:07:36.804 "suffix": "-pre", 00:07:36.804 "commit": "b2ac96cc2" 00:07:36.804 } 00:07:36.804 } 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 
)) 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:36.804 11:48:26 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:07:36.804 11:48:26 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:36.804 request: 00:07:36.804 { 00:07:36.804 "method": "env_dpdk_get_mem_stats", 00:07:36.804 "req_id": 1 00:07:36.804 } 00:07:36.804 Got JSON-RPC error response 00:07:36.804 response: 00:07:36.804 { 00:07:36.804 "code": -32601, 00:07:36.804 
"message": "Method not found" 00:07:36.804 } 00:07:36.804 11:48:27 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:07:36.804 11:48:27 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:36.804 11:48:27 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:36.804 11:48:27 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:36.804 11:48:27 app_cmdline -- app/cmdline.sh@1 -- # killprocess 569405 00:07:36.804 11:48:27 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 569405 ']' 00:07:36.804 11:48:27 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 569405 00:07:36.804 11:48:27 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:07:36.804 11:48:27 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:36.804 11:48:27 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 569405 00:07:37.063 11:48:27 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:37.063 11:48:27 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:37.063 11:48:27 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 569405' 00:07:37.063 killing process with pid 569405 00:07:37.063 11:48:27 app_cmdline -- common/autotest_common.sh@967 -- # kill 569405 00:07:37.063 11:48:27 app_cmdline -- common/autotest_common.sh@972 -- # wait 569405 00:07:37.321 00:07:37.322 real 0m1.675s 00:07:37.322 user 0m1.966s 00:07:37.322 sys 0m0.445s 00:07:37.322 11:48:27 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:37.322 11:48:27 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:37.322 ************************************ 00:07:37.322 END TEST app_cmdline 00:07:37.322 ************************************ 00:07:37.322 11:48:27 -- common/autotest_common.sh@1142 -- # return 0 00:07:37.322 11:48:27 -- spdk/autotest.sh@186 -- # run_test version 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:37.322 11:48:27 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:37.322 11:48:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:37.322 11:48:27 -- common/autotest_common.sh@10 -- # set +x 00:07:37.322 ************************************ 00:07:37.322 START TEST version 00:07:37.322 ************************************ 00:07:37.322 11:48:27 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:37.322 * Looking for test storage... 00:07:37.322 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:37.322 11:48:27 version -- app/version.sh@17 -- # get_header_version major 00:07:37.322 11:48:27 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:37.322 11:48:27 version -- app/version.sh@14 -- # cut -f2 00:07:37.322 11:48:27 version -- app/version.sh@14 -- # tr -d '"' 00:07:37.322 11:48:27 version -- app/version.sh@17 -- # major=24 00:07:37.322 11:48:27 version -- app/version.sh@18 -- # get_header_version minor 00:07:37.322 11:48:27 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:37.322 11:48:27 version -- app/version.sh@14 -- # cut -f2 00:07:37.322 11:48:27 version -- app/version.sh@14 -- # tr -d '"' 00:07:37.322 11:48:27 version -- app/version.sh@18 -- # minor=9 00:07:37.322 11:48:27 version -- app/version.sh@19 -- # get_header_version patch 00:07:37.322 11:48:27 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:37.322 11:48:27 version -- app/version.sh@14 -- # cut -f2 00:07:37.322 11:48:27 version -- app/version.sh@14 -- # tr -d '"' 00:07:37.322 
11:48:27 version -- app/version.sh@19 -- # patch=0 00:07:37.322 11:48:27 version -- app/version.sh@20 -- # get_header_version suffix 00:07:37.322 11:48:27 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:37.322 11:48:27 version -- app/version.sh@14 -- # cut -f2 00:07:37.322 11:48:27 version -- app/version.sh@14 -- # tr -d '"' 00:07:37.322 11:48:27 version -- app/version.sh@20 -- # suffix=-pre 00:07:37.322 11:48:27 version -- app/version.sh@22 -- # version=24.9 00:07:37.322 11:48:27 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:37.322 11:48:27 version -- app/version.sh@28 -- # version=24.9rc0 00:07:37.322 11:48:27 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:07:37.322 11:48:27 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:37.581 11:48:27 version -- app/version.sh@30 -- # py_version=24.9rc0 00:07:37.581 11:48:27 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:07:37.581 00:07:37.581 real 0m0.147s 00:07:37.581 user 0m0.084s 00:07:37.581 sys 0m0.099s 00:07:37.581 11:48:27 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:37.581 11:48:27 version -- common/autotest_common.sh@10 -- # set +x 00:07:37.581 ************************************ 00:07:37.581 END TEST version 00:07:37.581 ************************************ 00:07:37.581 11:48:27 -- common/autotest_common.sh@1142 -- # return 0 00:07:37.581 11:48:27 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:07:37.581 11:48:27 -- spdk/autotest.sh@189 -- # run_test blockdev_general 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:37.581 11:48:27 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:37.581 11:48:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:37.581 11:48:27 -- common/autotest_common.sh@10 -- # set +x 00:07:37.581 ************************************ 00:07:37.581 START TEST blockdev_general 00:07:37.581 ************************************ 00:07:37.581 11:48:27 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:37.581 * Looking for test storage... 00:07:37.581 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:37.581 11:48:27 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 
00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=569770 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 569770 00:07:37.581 11:48:27 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 569770 ']' 00:07:37.581 11:48:27 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:37.581 11:48:27 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:37.581 11:48:27 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:07:37.581 11:48:27 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:07:37.581 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:37.581 11:48:27 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:37.581 11:48:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:37.581 [2024-07-12 11:48:27.810604] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:37.581 [2024-07-12 11:48:27.810646] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid569770 ] 00:07:37.840 [2024-07-12 11:48:27.875730] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.840 [2024-07-12 11:48:27.951979] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.406 11:48:28 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:38.406 11:48:28 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:07:38.406 11:48:28 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:07:38.406 11:48:28 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:07:38.406 11:48:28 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:07:38.406 11:48:28 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:38.406 11:48:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:38.665 [2024-07-12 11:48:28.777808] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:38.665 [2024-07-12 11:48:28.777846] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:38.665 00:07:38.665 [2024-07-12 11:48:28.785800] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:38.665 [2024-07-12 11:48:28.785814] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
Malloc2 00:07:38.665 00:07:38.665 Malloc0 00:07:38.665 Malloc1 00:07:38.665 Malloc2 00:07:38.665 Malloc3 00:07:38.665 Malloc4 00:07:38.665 Malloc5 00:07:38.665 Malloc6 00:07:38.665 Malloc7 00:07:38.665 Malloc8 00:07:38.665 Malloc9 00:07:38.665 [2024-07-12 11:48:28.910026] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:38.665 [2024-07-12 11:48:28.910061] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:38.665 [2024-07-12 11:48:28.910072] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1897700 00:07:38.665 [2024-07-12 11:48:28.910078] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:38.924 [2024-07-12 11:48:28.911012] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:38.924 [2024-07-12 11:48:28.911031] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:38.924 TestPT 00:07:38.924 11:48:28 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:38.924 11:48:28 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:07:38.924 5000+0 records in 00:07:38.924 5000+0 records out 00:07:38.924 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0103234 s, 992 MB/s 00:07:38.925 11:48:28 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:07:38.925 11:48:28 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:38.925 11:48:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:38.925 AIO0 00:07:38.925 11:48:28 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:38.925 11:48:28 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:07:38.925 11:48:28 blockdev_general -- common/autotest_common.sh@559 -- # 
xtrace_disable 00:07:38.925 11:48:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:38.925 11:48:29 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:07:38.925 11:48:29 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:38.925 11:48:29 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:38.925 11:48:29 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:38.925 11:48:29 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:07:38.925 11:48:29 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:07:38.925 11:48:29 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:38.925 11:48:29 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:38.925 11:48:29 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile 
-t bdevs_name 00:07:38.925 11:48:29 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:07:38.926 11:48:29 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "29a8fdaa-3220-4071-b25e-738ed86662a7"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "29a8fdaa-3220-4071-b25e-738ed86662a7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "beaefb4e-f654-55cf-bb7e-e49b737552c1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "beaefb4e-f654-55cf-bb7e-e49b737552c1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "dbb42538-0ae6-52d9-9bcf-133bf8425691"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dbb42538-0ae6-52d9-9bcf-133bf8425691",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "d17783e3-052b-52cb-bd66-9b2ce9ea6276"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d17783e3-052b-52cb-bd66-9b2ce9ea6276",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "8ad10dcd-f415-55d1-ba1b-65c6d55f91ba"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8ad10dcd-f415-55d1-ba1b-65c6d55f91ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "7028417e-6dc8-5281-9f6f-bb788ffbd1e6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7028417e-6dc8-5281-9f6f-bb788ffbd1e6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": 
true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "b371e7e7-051d-5780-8690-87a00573951f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b371e7e7-051d-5780-8690-87a00573951f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "83c5d8fb-d379-549f-804b-f70859975ba1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "83c5d8fb-d379-549f-804b-f70859975ba1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "f34ea871-47b6-55b6-8ef3-58b44e0cef72"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f34ea871-47b6-55b6-8ef3-58b44e0cef72",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "d0e84be1-e6cd-57b9-af72-7a772f020081"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d0e84be1-e6cd-57b9-af72-7a772f020081",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "8bf9b11e-eff3-5d98-b73f-2922458a6197"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8bf9b11e-eff3-5d98-b73f-2922458a6197",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "18849010-fbb6-51f4-b18b-cc7a42f9e5cc"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "18849010-fbb6-51f4-b18b-cc7a42f9e5cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "222481d5-1dab-4aae-bbba-3ca76a0366df"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "222481d5-1dab-4aae-bbba-3ca76a0366df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "222481d5-1dab-4aae-bbba-3ca76a0366df",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "ddb60a11-543b-4525-941d-473c8f0fa170",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "eea1506b-9bbb-4965-9ae6-f6a0326ffcfa",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "9cb2fc3c-9bea-4d08-be6a-d1d368617882"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "9cb2fc3c-9bea-4d08-be6a-d1d368617882",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "9cb2fc3c-9bea-4d08-be6a-d1d368617882",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "2cfba0c8-8a4c-4205-b5b1-f9484a6d0a31",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "b0cdac0c-786d-43d6-ba0d-47919827d96b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "93c54a51-ea1d-4613-9c81-5ed60b054b00"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "93c54a51-ea1d-4613-9c81-5ed60b054b00",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "93c54a51-ea1d-4613-9c81-5ed60b054b00",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a93dba3e-c01c-4afb-a638-ac5eab3927c4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "491d6243-7f14-4d7a-95f3-8a9857aaf76b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "c3585356-201b-4eb4-af9b-c8f07b4a8910"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "c3585356-201b-4eb4-af9b-c8f07b4a8910",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:39.185 11:48:29 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:07:39.185 11:48:29 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:07:39.185 11:48:29 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:07:39.185 11:48:29 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 569770 00:07:39.185 11:48:29 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 569770 ']' 00:07:39.185 11:48:29 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 569770 00:07:39.185 11:48:29 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:07:39.185 11:48:29 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:39.185 11:48:29 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 569770 00:07:39.185 11:48:29 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:39.185 11:48:29 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:39.185 11:48:29 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 569770' 00:07:39.185 killing process with pid 569770 00:07:39.185 11:48:29 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 569770 00:07:39.185 11:48:29 blockdev_general -- common/autotest_common.sh@972 -- # wait 569770 00:07:39.444 11:48:29 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:39.444 11:48:29 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:39.444 11:48:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:39.444 11:48:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.444 11:48:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:39.444 ************************************ 00:07:39.444 START TEST bdev_hello_world 00:07:39.444 ************************************ 00:07:39.444 11:48:29 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:39.444 [2024-07-12 11:48:29.682951] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:39.444 [2024-07-12 11:48:29.682983] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid570228 ] 00:07:39.703 [2024-07-12 11:48:29.744997] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.703 [2024-07-12 11:48:29.815257] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.703 [2024-07-12 11:48:29.946434] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:39.703 [2024-07-12 11:48:29.946471] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:39.703 [2024-07-12 11:48:29.946478] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:39.962 [2024-07-12 11:48:29.954442] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:39.962 [2024-07-12 11:48:29.954457] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:39.962 [2024-07-12 11:48:29.962455] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:39.962 [2024-07-12 11:48:29.962469] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:39.962 [2024-07-12 11:48:30.029785] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:39.962 [2024-07-12 11:48:30.029822] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:39.962 [2024-07-12 11:48:30.029831] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x255b900 00:07:39.962 [2024-07-12 11:48:30.029837] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:39.962 [2024-07-12 11:48:30.030802] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:07:39.962 [2024-07-12 11:48:30.030822] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:39.962 [2024-07-12 11:48:30.157901] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:07:39.962 [2024-07-12 11:48:30.157937] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:07:39.962 [2024-07-12 11:48:30.157960] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:07:39.962 [2024-07-12 11:48:30.157994] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:07:39.962 [2024-07-12 11:48:30.158028] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:07:39.962 [2024-07-12 11:48:30.158039] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:07:39.962 [2024-07-12 11:48:30.158073] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:07:39.962 00:07:39.962 [2024-07-12 11:48:30.158089] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:07:40.222 00:07:40.222 real 0m0.775s 00:07:40.222 user 0m0.538s 00:07:40.222 sys 0m0.211s 00:07:40.222 11:48:30 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:40.222 11:48:30 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:07:40.222 ************************************ 00:07:40.222 END TEST bdev_hello_world 00:07:40.222 ************************************ 00:07:40.222 11:48:30 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:07:40.222 11:48:30 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:07:40.222 11:48:30 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:40.222 11:48:30 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.222 11:48:30 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:40.480 ************************************ 00:07:40.480 START TEST bdev_bounds 
00:07:40.480 ************************************ 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=570263 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 570263' 00:07:40.480 Process bdevio pid: 570263 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 570263 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 570263 ']' 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:40.480 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:40.480 11:48:30 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:40.480 [2024-07-12 11:48:30.520122] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:07:40.481 [2024-07-12 11:48:30.520164] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid570263 ] 00:07:40.481 [2024-07-12 11:48:30.585909] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:40.481 [2024-07-12 11:48:30.665431] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:40.481 [2024-07-12 11:48:30.665538] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:40.481 [2024-07-12 11:48:30.665545] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.739 [2024-07-12 11:48:30.801765] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:40.739 [2024-07-12 11:48:30.801811] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:40.739 [2024-07-12 11:48:30.801819] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:40.739 [2024-07-12 11:48:30.809776] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:40.739 [2024-07-12 11:48:30.809792] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:40.739 [2024-07-12 11:48:30.817793] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:40.739 [2024-07-12 11:48:30.817807] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:40.739 [2024-07-12 11:48:30.885191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:40.739 [2024-07-12 11:48:30.885227] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:40.739 [2024-07-12 11:48:30.885236] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe52be0 00:07:40.739 [2024-07-12 
11:48:30.885242] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:40.739 [2024-07-12 11:48:30.886278] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:40.739 [2024-07-12 11:48:30.886298] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:41.308 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:41.308 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:07:41.308 11:48:31 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:07:41.308 I/O targets: 00:07:41.308 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:07:41.308 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:07:41.308 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:07:41.308 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:07:41.308 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:07:41.308 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:07:41.308 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:07:41.308 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:07:41.308 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:07:41.308 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:07:41.308 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:07:41.308 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:07:41.308 raid0: 131072 blocks of 512 bytes (64 MiB) 00:07:41.308 concat0: 131072 blocks of 512 bytes (64 MiB) 00:07:41.308 raid1: 65536 blocks of 512 bytes (32 MiB) 00:07:41.308 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:07:41.308 00:07:41.308 00:07:41.308 CUnit - A unit testing framework for C - Version 2.1-3 00:07:41.308 http://cunit.sourceforge.net/ 00:07:41.308 00:07:41.308 00:07:41.308 Suite: bdevio tests on: AIO0 00:07:41.308 Test: blockdev write read block ...passed 00:07:41.308 Test: blockdev write zeroes read block ...passed 00:07:41.308 Test: blockdev write zeroes 
read no split ...passed 00:07:41.308 Test: blockdev write zeroes read split ...passed 00:07:41.308 Test: blockdev write zeroes read split partial ...passed 00:07:41.308 Test: blockdev reset ...passed 00:07:41.308 Test: blockdev write read 8 blocks ...passed 00:07:41.308 Test: blockdev write read size > 128k ...passed 00:07:41.308 Test: blockdev write read invalid size ...passed 00:07:41.308 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.308 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.308 Test: blockdev write read max offset ...passed 00:07:41.308 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.308 Test: blockdev writev readv 8 blocks ...passed 00:07:41.308 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.308 Test: blockdev writev readv block ...passed 00:07:41.308 Test: blockdev writev readv size > 128k ...passed 00:07:41.308 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.308 Test: blockdev comparev and writev ...passed 00:07:41.308 Test: blockdev nvme passthru rw ...passed 00:07:41.308 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.308 Test: blockdev nvme admin passthru ...passed 00:07:41.308 Test: blockdev copy ...passed 00:07:41.308 Suite: bdevio tests on: raid1 00:07:41.308 Test: blockdev write read block ...passed 00:07:41.308 Test: blockdev write zeroes read block ...passed 00:07:41.308 Test: blockdev write zeroes read no split ...passed 00:07:41.308 Test: blockdev write zeroes read split ...passed 00:07:41.308 Test: blockdev write zeroes read split partial ...passed 00:07:41.308 Test: blockdev reset ...passed 00:07:41.308 Test: blockdev write read 8 blocks ...passed 00:07:41.308 Test: blockdev write read size > 128k ...passed 00:07:41.308 Test: blockdev write read invalid size ...passed 00:07:41.308 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.308 Test: blockdev write 
read offset + nbytes > size of blockdev ...passed 00:07:41.308 Test: blockdev write read max offset ...passed 00:07:41.308 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.308 Test: blockdev writev readv 8 blocks ...passed 00:07:41.308 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.308 Test: blockdev writev readv block ...passed 00:07:41.308 Test: blockdev writev readv size > 128k ...passed 00:07:41.308 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.308 Test: blockdev comparev and writev ...passed 00:07:41.308 Test: blockdev nvme passthru rw ...passed 00:07:41.308 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.308 Test: blockdev nvme admin passthru ...passed 00:07:41.308 Test: blockdev copy ...passed 00:07:41.308 Suite: bdevio tests on: concat0 00:07:41.308 Test: blockdev write read block ...passed 00:07:41.308 Test: blockdev write zeroes read block ...passed 00:07:41.308 Test: blockdev write zeroes read no split ...passed 00:07:41.308 Test: blockdev write zeroes read split ...passed 00:07:41.308 Test: blockdev write zeroes read split partial ...passed 00:07:41.308 Test: blockdev reset ...passed 00:07:41.308 Test: blockdev write read 8 blocks ...passed 00:07:41.308 Test: blockdev write read size > 128k ...passed 00:07:41.308 Test: blockdev write read invalid size ...passed 00:07:41.308 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.308 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.308 Test: blockdev write read max offset ...passed 00:07:41.308 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.308 Test: blockdev writev readv 8 blocks ...passed 00:07:41.308 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.308 Test: blockdev writev readv block ...passed 00:07:41.308 Test: blockdev writev readv size > 128k ...passed 00:07:41.308 Test: blockdev writev readv size > 128k in two 
iovs ...passed 00:07:41.308 Test: blockdev comparev and writev ...passed 00:07:41.308 Test: blockdev nvme passthru rw ...passed 00:07:41.308 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.308 Test: blockdev nvme admin passthru ...passed 00:07:41.308 Test: blockdev copy ...passed 00:07:41.308 Suite: bdevio tests on: raid0 00:07:41.308 Test: blockdev write read block ...passed 00:07:41.308 Test: blockdev write zeroes read block ...passed 00:07:41.308 Test: blockdev write zeroes read no split ...passed 00:07:41.308 Test: blockdev write zeroes read split ...passed 00:07:41.308 Test: blockdev write zeroes read split partial ...passed 00:07:41.308 Test: blockdev reset ...passed 00:07:41.308 Test: blockdev write read 8 blocks ...passed 00:07:41.308 Test: blockdev write read size > 128k ...passed 00:07:41.308 Test: blockdev write read invalid size ...passed 00:07:41.308 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.308 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.308 Test: blockdev write read max offset ...passed 00:07:41.308 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.308 Test: blockdev writev readv 8 blocks ...passed 00:07:41.308 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.308 Test: blockdev writev readv block ...passed 00:07:41.308 Test: blockdev writev readv size > 128k ...passed 00:07:41.308 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.308 Test: blockdev comparev and writev ...passed 00:07:41.308 Test: blockdev nvme passthru rw ...passed 00:07:41.308 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.308 Test: blockdev nvme admin passthru ...passed 00:07:41.308 Test: blockdev copy ...passed 00:07:41.308 Suite: bdevio tests on: TestPT 00:07:41.308 Test: blockdev write read block ...passed 00:07:41.308 Test: blockdev write zeroes read block ...passed 00:07:41.308 Test: blockdev write zeroes 
read no split ...passed 00:07:41.308 Test: blockdev write zeroes read split ...passed 00:07:41.308 Test: blockdev write zeroes read split partial ...passed 00:07:41.308 Test: blockdev reset ...passed 00:07:41.308 Test: blockdev write read 8 blocks ...passed 00:07:41.308 Test: blockdev write read size > 128k ...passed 00:07:41.308 Test: blockdev write read invalid size ...passed 00:07:41.308 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.308 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.308 Test: blockdev write read max offset ...passed 00:07:41.308 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.308 Test: blockdev writev readv 8 blocks ...passed 00:07:41.308 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.308 Test: blockdev writev readv block ...passed 00:07:41.308 Test: blockdev writev readv size > 128k ...passed 00:07:41.308 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.308 Test: blockdev comparev and writev ...passed 00:07:41.308 Test: blockdev nvme passthru rw ...passed 00:07:41.308 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.308 Test: blockdev nvme admin passthru ...passed 00:07:41.308 Test: blockdev copy ...passed 00:07:41.308 Suite: bdevio tests on: Malloc2p7 00:07:41.308 Test: blockdev write read block ...passed 00:07:41.308 Test: blockdev write zeroes read block ...passed 00:07:41.308 Test: blockdev write zeroes read no split ...passed 00:07:41.308 Test: blockdev write zeroes read split ...passed 00:07:41.308 Test: blockdev write zeroes read split partial ...passed 00:07:41.308 Test: blockdev reset ...passed 00:07:41.308 Test: blockdev write read 8 blocks ...passed 00:07:41.308 Test: blockdev write read size > 128k ...passed 00:07:41.308 Test: blockdev write read invalid size ...passed 00:07:41.308 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.308 Test: blockdev 
write read offset + nbytes > size of blockdev ...passed 00:07:41.308 Test: blockdev write read max offset ...passed 00:07:41.308 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.309 Test: blockdev writev readv 8 blocks ...passed 00:07:41.309 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.309 Test: blockdev writev readv block ...passed 00:07:41.309 Test: blockdev writev readv size > 128k ...passed 00:07:41.309 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.309 Test: blockdev comparev and writev ...passed 00:07:41.309 Test: blockdev nvme passthru rw ...passed 00:07:41.309 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.309 Test: blockdev nvme admin passthru ...passed 00:07:41.309 Test: blockdev copy ...passed 00:07:41.309 Suite: bdevio tests on: Malloc2p6 00:07:41.309 Test: blockdev write read block ...passed 00:07:41.309 Test: blockdev write zeroes read block ...passed 00:07:41.309 Test: blockdev write zeroes read no split ...passed 00:07:41.309 Test: blockdev write zeroes read split ...passed 00:07:41.309 Test: blockdev write zeroes read split partial ...passed 00:07:41.309 Test: blockdev reset ...passed 00:07:41.309 Test: blockdev write read 8 blocks ...passed 00:07:41.309 Test: blockdev write read size > 128k ...passed 00:07:41.309 Test: blockdev write read invalid size ...passed 00:07:41.309 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.309 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.309 Test: blockdev write read max offset ...passed 00:07:41.309 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.309 Test: blockdev writev readv 8 blocks ...passed 00:07:41.309 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.309 Test: blockdev writev readv block ...passed 00:07:41.309 Test: blockdev writev readv size > 128k ...passed 00:07:41.309 Test: blockdev writev readv size > 
128k in two iovs ...passed 00:07:41.309 Test: blockdev comparev and writev ...passed 00:07:41.309 Test: blockdev nvme passthru rw ...passed 00:07:41.309 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.309 Test: blockdev nvme admin passthru ...passed 00:07:41.309 Test: blockdev copy ...passed 00:07:41.309 Suite: bdevio tests on: Malloc2p5 00:07:41.309 Test: blockdev write read block ...passed 00:07:41.309 Test: blockdev write zeroes read block ...passed 00:07:41.309 Test: blockdev write zeroes read no split ...passed 00:07:41.309 Test: blockdev write zeroes read split ...passed 00:07:41.309 Test: blockdev write zeroes read split partial ...passed 00:07:41.309 Test: blockdev reset ...passed 00:07:41.309 Test: blockdev write read 8 blocks ...passed 00:07:41.309 Test: blockdev write read size > 128k ...passed 00:07:41.309 Test: blockdev write read invalid size ...passed 00:07:41.309 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.309 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.309 Test: blockdev write read max offset ...passed 00:07:41.309 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.309 Test: blockdev writev readv 8 blocks ...passed 00:07:41.309 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.309 Test: blockdev writev readv block ...passed 00:07:41.309 Test: blockdev writev readv size > 128k ...passed 00:07:41.309 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.309 Test: blockdev comparev and writev ...passed 00:07:41.309 Test: blockdev nvme passthru rw ...passed 00:07:41.309 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.309 Test: blockdev nvme admin passthru ...passed 00:07:41.309 Test: blockdev copy ...passed 00:07:41.309 Suite: bdevio tests on: Malloc2p4 00:07:41.309 Test: blockdev write read block ...passed 00:07:41.309 Test: blockdev write zeroes read block ...passed 00:07:41.309 Test: 
blockdev write zeroes read no split ...passed 00:07:41.309 Test: blockdev write zeroes read split ...passed 00:07:41.309 Test: blockdev write zeroes read split partial ...passed 00:07:41.309 Test: blockdev reset ...passed 00:07:41.309 Test: blockdev write read 8 blocks ...passed 00:07:41.309 Test: blockdev write read size > 128k ...passed 00:07:41.309 Test: blockdev write read invalid size ...passed 00:07:41.309 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.309 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.309 Test: blockdev write read max offset ...passed 00:07:41.309 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.309 Test: blockdev writev readv 8 blocks ...passed 00:07:41.309 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.309 Test: blockdev writev readv block ...passed 00:07:41.309 Test: blockdev writev readv size > 128k ...passed 00:07:41.309 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.309 Test: blockdev comparev and writev ...passed 00:07:41.309 Test: blockdev nvme passthru rw ...passed 00:07:41.309 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.309 Test: blockdev nvme admin passthru ...passed 00:07:41.569 Test: blockdev copy ...passed 00:07:41.569 Suite: bdevio tests on: Malloc2p3 00:07:41.569 Test: blockdev write read block ...passed 00:07:41.569 Test: blockdev write zeroes read block ...passed 00:07:41.569 Test: blockdev write zeroes read no split ...passed 00:07:41.569 Test: blockdev write zeroes read split ...passed 00:07:41.569 Test: blockdev write zeroes read split partial ...passed 00:07:41.569 Test: blockdev reset ...passed 00:07:41.569 Test: blockdev write read 8 blocks ...passed 00:07:41.569 Test: blockdev write read size > 128k ...passed 00:07:41.569 Test: blockdev write read invalid size ...passed 00:07:41.569 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:07:41.569 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.569 Test: blockdev write read max offset ...passed 00:07:41.569 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.569 Test: blockdev writev readv 8 blocks ...passed 00:07:41.569 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.569 Test: blockdev writev readv block ...passed 00:07:41.569 Test: blockdev writev readv size > 128k ...passed 00:07:41.569 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.569 Test: blockdev comparev and writev ...passed 00:07:41.569 Test: blockdev nvme passthru rw ...passed 00:07:41.569 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.569 Test: blockdev nvme admin passthru ...passed 00:07:41.569 Test: blockdev copy ...passed 00:07:41.569 Suite: bdevio tests on: Malloc2p2 00:07:41.569 Test: blockdev write read block ...passed 00:07:41.569 Test: blockdev write zeroes read block ...passed 00:07:41.569 Test: blockdev write zeroes read no split ...passed 00:07:41.569 Test: blockdev write zeroes read split ...passed 00:07:41.569 Test: blockdev write zeroes read split partial ...passed 00:07:41.569 Test: blockdev reset ...passed 00:07:41.569 Test: blockdev write read 8 blocks ...passed 00:07:41.569 Test: blockdev write read size > 128k ...passed 00:07:41.569 Test: blockdev write read invalid size ...passed 00:07:41.569 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.569 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.569 Test: blockdev write read max offset ...passed 00:07:41.569 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.569 Test: blockdev writev readv 8 blocks ...passed 00:07:41.569 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.569 Test: blockdev writev readv block ...passed 00:07:41.569 Test: blockdev writev readv size > 128k ...passed 00:07:41.569 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:07:41.569 Test: blockdev comparev and writev ...passed 00:07:41.569 Test: blockdev nvme passthru rw ...passed 00:07:41.569 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.569 Test: blockdev nvme admin passthru ...passed 00:07:41.569 Test: blockdev copy ...passed 00:07:41.569 Suite: bdevio tests on: Malloc2p1 00:07:41.569 Test: blockdev write read block ...passed 00:07:41.569 Test: blockdev write zeroes read block ...passed 00:07:41.569 Test: blockdev write zeroes read no split ...passed 00:07:41.569 Test: blockdev write zeroes read split ...passed 00:07:41.569 Test: blockdev write zeroes read split partial ...passed 00:07:41.569 Test: blockdev reset ...passed 00:07:41.569 Test: blockdev write read 8 blocks ...passed 00:07:41.569 Test: blockdev write read size > 128k ...passed 00:07:41.569 Test: blockdev write read invalid size ...passed 00:07:41.569 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.569 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.569 Test: blockdev write read max offset ...passed 00:07:41.569 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.569 Test: blockdev writev readv 8 blocks ...passed 00:07:41.569 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.569 Test: blockdev writev readv block ...passed 00:07:41.569 Test: blockdev writev readv size > 128k ...passed 00:07:41.569 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.569 Test: blockdev comparev and writev ...passed 00:07:41.569 Test: blockdev nvme passthru rw ...passed 00:07:41.569 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.569 Test: blockdev nvme admin passthru ...passed 00:07:41.569 Test: blockdev copy ...passed 00:07:41.569 Suite: bdevio tests on: Malloc2p0 00:07:41.569 Test: blockdev write read block ...passed 00:07:41.569 Test: blockdev write zeroes read block 
...passed 00:07:41.569 Test: blockdev write zeroes read no split ...passed 00:07:41.569 Test: blockdev write zeroes read split ...passed 00:07:41.569 Test: blockdev write zeroes read split partial ...passed 00:07:41.569 Test: blockdev reset ...passed 00:07:41.569 Test: blockdev write read 8 blocks ...passed 00:07:41.569 Test: blockdev write read size > 128k ...passed 00:07:41.569 Test: blockdev write read invalid size ...passed 00:07:41.569 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.569 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.569 Test: blockdev write read max offset ...passed 00:07:41.569 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.569 Test: blockdev writev readv 8 blocks ...passed 00:07:41.569 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.569 Test: blockdev writev readv block ...passed 00:07:41.569 Test: blockdev writev readv size > 128k ...passed 00:07:41.569 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.569 Test: blockdev comparev and writev ...passed 00:07:41.569 Test: blockdev nvme passthru rw ...passed 00:07:41.569 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.569 Test: blockdev nvme admin passthru ...passed 00:07:41.569 Test: blockdev copy ...passed 00:07:41.569 Suite: bdevio tests on: Malloc1p1 00:07:41.569 Test: blockdev write read block ...passed 00:07:41.569 Test: blockdev write zeroes read block ...passed 00:07:41.569 Test: blockdev write zeroes read no split ...passed 00:07:41.569 Test: blockdev write zeroes read split ...passed 00:07:41.569 Test: blockdev write zeroes read split partial ...passed 00:07:41.569 Test: blockdev reset ...passed 00:07:41.569 Test: blockdev write read 8 blocks ...passed 00:07:41.569 Test: blockdev write read size > 128k ...passed 00:07:41.569 Test: blockdev write read invalid size ...passed 00:07:41.569 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:07:41.569 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.569 Test: blockdev write read max offset ...passed 00:07:41.569 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.569 Test: blockdev writev readv 8 blocks ...passed 00:07:41.569 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.569 Test: blockdev writev readv block ...passed 00:07:41.569 Test: blockdev writev readv size > 128k ...passed 00:07:41.569 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.569 Test: blockdev comparev and writev ...passed 00:07:41.569 Test: blockdev nvme passthru rw ...passed 00:07:41.569 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.569 Test: blockdev nvme admin passthru ...passed 00:07:41.569 Test: blockdev copy ...passed 00:07:41.569 Suite: bdevio tests on: Malloc1p0 00:07:41.569 Test: blockdev write read block ...passed 00:07:41.569 Test: blockdev write zeroes read block ...passed 00:07:41.569 Test: blockdev write zeroes read no split ...passed 00:07:41.569 Test: blockdev write zeroes read split ...passed 00:07:41.569 Test: blockdev write zeroes read split partial ...passed 00:07:41.569 Test: blockdev reset ...passed 00:07:41.569 Test: blockdev write read 8 blocks ...passed 00:07:41.569 Test: blockdev write read size > 128k ...passed 00:07:41.569 Test: blockdev write read invalid size ...passed 00:07:41.569 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.569 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.569 Test: blockdev write read max offset ...passed 00:07:41.570 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.570 Test: blockdev writev readv 8 blocks ...passed 00:07:41.570 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.570 Test: blockdev writev readv block ...passed 00:07:41.570 Test: blockdev writev readv size > 128k ...passed 
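Every bdev in this run is exercised by the same 23-test bdevio suite, so the totals in the run summary at the end of this output follow directly from the suite count. A quick sanity check (the per-suite test count is read off this log, not taken from the bdevio source):

```shell
# Sanity-check the bdevio run summary: 16 suites x 23 tests per suite.
suites=16
tests_per_suite=23
total=$((suites * tests_per_suite))
echo "$total"   # prints 368, matching "tests 368 368 368" in the summary
```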
00:07:41.570 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.570 Test: blockdev comparev and writev ...passed 00:07:41.570 Test: blockdev nvme passthru rw ...passed 00:07:41.570 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.570 Test: blockdev nvme admin passthru ...passed 00:07:41.570 Test: blockdev copy ...passed 00:07:41.570 Suite: bdevio tests on: Malloc0 00:07:41.570 Test: blockdev write read block ...passed 00:07:41.570 Test: blockdev write zeroes read block ...passed 00:07:41.570 Test: blockdev write zeroes read no split ...passed 00:07:41.570 Test: blockdev write zeroes read split ...passed 00:07:41.570 Test: blockdev write zeroes read split partial ...passed 00:07:41.570 Test: blockdev reset ...passed 00:07:41.570 Test: blockdev write read 8 blocks ...passed 00:07:41.570 Test: blockdev write read size > 128k ...passed 00:07:41.570 Test: blockdev write read invalid size ...passed 00:07:41.570 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:41.570 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:41.570 Test: blockdev write read max offset ...passed 00:07:41.570 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:41.570 Test: blockdev writev readv 8 blocks ...passed 00:07:41.570 Test: blockdev writev readv 30 x 1block ...passed 00:07:41.570 Test: blockdev writev readv block ...passed 00:07:41.570 Test: blockdev writev readv size > 128k ...passed 00:07:41.570 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:41.570 Test: blockdev comparev and writev ...passed 00:07:41.570 Test: blockdev nvme passthru rw ...passed 00:07:41.570 Test: blockdev nvme passthru vendor specific ...passed 00:07:41.570 Test: blockdev nvme admin passthru ...passed 00:07:41.570 Test: blockdev copy ...passed 00:07:41.570 00:07:41.570 Run Summary: Type Total Ran Passed Failed Inactive 00:07:41.570 suites 16 16 n/a 0 0 00:07:41.570 tests 368 368 368 
0 0 00:07:41.570 asserts 2224 2224 2224 0 n/a 00:07:41.570 00:07:41.570 Elapsed time = 0.461 seconds 00:07:41.570 0 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 570263 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 570263 ']' 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 570263 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 570263 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 570263' 00:07:41.570 killing process with pid 570263 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 570263 00:07:41.570 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 570263 00:07:41.830 11:48:31 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:07:41.830 00:07:41.830 real 0m1.442s 00:07:41.830 user 0m3.706s 00:07:41.830 sys 0m0.340s 00:07:41.830 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.830 11:48:31 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:41.830 ************************************ 00:07:41.830 END TEST bdev_bounds 00:07:41.830 ************************************ 00:07:41.830 11:48:31 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:07:41.830 11:48:31 
blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:41.830 11:48:31 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:07:41.830 11:48:31 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.830 11:48:31 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:41.830 ************************************ 00:07:41.830 START TEST bdev_nbd 00:07:41.830 ************************************ 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ 
-e /sys/module/nbd ]] 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:07:41.830 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=570573 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 570573 /var/tmp/spdk-nbd.sock 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 570573 ']' 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:41.831 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:41.831 11:48:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:41.831 [2024-07-12 11:48:32.040533] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:07:41.831 [2024-07-12 11:48:32.040574] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:42.089 [2024-07-12 11:48:32.104860] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.089 [2024-07-12 11:48:32.182648] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.089 [2024-07-12 11:48:32.317080] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:42.089 [2024-07-12 11:48:32.317128] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:42.089 [2024-07-12 11:48:32.317135] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:42.089 [2024-07-12 11:48:32.325091] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:42.090 [2024-07-12 11:48:32.325108] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:42.090 [2024-07-12 11:48:32.333106] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:07:42.090 [2024-07-12 11:48:32.333120] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:42.348 [2024-07-12 11:48:32.399937] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:42.348 [2024-07-12 11:48:32.399967] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:42.348 [2024-07-12 11:48:32.399976] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b7fbd0 00:07:42.348 [2024-07-12 11:48:32.399982] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:42.348 [2024-07-12 11:48:32.401068] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:42.348 [2024-07-12 11:48:32.401089] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 
Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:42.606 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:07:42.867 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:07:42.867 11:48:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@871 -- # break 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:42.867 1+0 records in 00:07:42.867 1+0 records out 00:07:42.867 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022875 s, 17.9 MB/s 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:42.867 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:43.127 11:48:33 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:43.127 1+0 records in 00:07:43.127 1+0 records out 00:07:43.127 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000207807 s, 19.7 MB/s 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:43.127 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 
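The `waitfornbd` calls traced above all follow the same polling pattern: retry a word-match against `/proc/partitions` up to 20 times until the kernel exposes the device. A simplified sketch of that loop; the optional second argument is an addition here (not part of the traced helper) so the sketch can run against a fake partitions table instead of the live kernel:

```shell
# Simplified sketch of the waitfornbd polling loop seen in the trace:
# wait until the named nbd device shows up in the partitions table,
# giving up after 20 attempts. The second (file) argument is an
# assumption added here so the sketch is self-contained.
waitfornbd_sketch() {
    local nbd_name=$1
    local partitions=${2:-/proc/partitions}
    local i
    for ((i = 1; i <= 20; i++)); do
        # -w avoids matching nbd1 when waiting for nbd10, and vice versa
        grep -q -w "$nbd_name" "$partitions" && return 0
        sleep 0.1
    done
    return 1
}

# Demonstrate against a fake partitions table.
printf '%s\n' 'major minor #blocks name' '43 0 1048576 nbd0' > /tmp/fake_partitions
waitfornbd_sketch nbd0 /tmp/fake_partitions && echo "nbd0 present"
```

In the real helper the successful `grep` is followed by a one-block `dd ... bs=4096 count=1 iflag=direct` round trip plus a `stat -c %s` size check, which is what produces the `1+0 records in/out` and `4096 bytes` lines in the trace.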
00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:43.386 1+0 records in 00:07:43.386 1+0 records out 00:07:43.386 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241527 s, 17.0 MB/s 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:43.386 11:48:33 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@887 -- # return 0 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:43.386 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:43.645 1+0 records in 00:07:43.645 1+0 records out 00:07:43.645 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243068 s, 16.9 MB/s 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:07:43.645 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:43.646 1+0 records in 00:07:43.646 1+0 records out 00:07:43.646 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245574 s, 16.7 MB/s 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:43.646 11:48:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:43.905 1+0 records in 00:07:43.905 1+0 records out 00:07:43.905 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238952 s, 17.1 MB/s 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:43.905 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:07:44.164 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:07:44.164 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:07:44.164 11:48:34 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:07:44.164 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:07:44.164 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:44.164 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.164 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.165 1+0 records in 00:07:44.165 1+0 records out 00:07:44.165 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272314 s, 15.0 MB/s 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.165 
11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:07:44.165 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.424 1+0 records in 00:07:44.424 1+0 records out 00:07:44.424 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000302562 s, 13.5 MB/s 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.424 1+0 records in 00:07:44.424 1+0 records out 
00:07:44.424 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247534 s, 16.5 MB/s 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.424 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.684 11:48:34 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.684 1+0 records in 00:07:44.684 1+0 records out 00:07:44.684 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262541 s, 15.6 MB/s 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.684 11:48:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 
00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.943 1+0 records in 00:07:44.943 1+0 records out 00:07:44.943 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000331282 s, 12.4 MB/s 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:44.943 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.203 1+0 records in 00:07:45.203 1+0 records out 00:07:45.203 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000332007 s, 12.3 MB/s 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:45.203 
11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.203 1+0 records in 00:07:45.203 1+0 records out 00:07:45.203 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292375 s, 14.0 MB/s 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.203 11:48:35 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@884 -- # size=4096 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:45.203 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:07:45.462 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:07:45.462 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:07:45.462 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:07:45.462 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:07:45.462 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:45.462 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:45.462 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:45.462 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.463 1+0 records in 00:07:45.463 1+0 records out 00:07:45.463 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000340431 s, 12.0 MB/s 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:45.463 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep 
-q -w nbd14 /proc/partitions 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.722 1+0 records in 00:07:45.722 1+0 records out 00:07:45.722 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000344469 s, 11.9 MB/s 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:45.722 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:07:45.981 11:48:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.981 1+0 records in 00:07:45.981 1+0 records out 00:07:45.981 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000321013 s, 12.8 MB/s 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:45.981 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:07:45.981 { 00:07:45.981 "nbd_device": "/dev/nbd0", 00:07:45.981 "bdev_name": "Malloc0" 00:07:45.981 }, 00:07:45.981 { 00:07:45.981 "nbd_device": "/dev/nbd1", 00:07:45.981 "bdev_name": "Malloc1p0" 00:07:45.981 }, 00:07:45.981 { 00:07:45.981 "nbd_device": "/dev/nbd2", 00:07:45.981 "bdev_name": "Malloc1p1" 00:07:45.981 }, 00:07:45.981 { 00:07:45.981 "nbd_device": "/dev/nbd3", 00:07:45.982 "bdev_name": "Malloc2p0" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd4", 00:07:45.982 "bdev_name": "Malloc2p1" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd5", 00:07:45.982 "bdev_name": "Malloc2p2" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd6", 00:07:45.982 "bdev_name": "Malloc2p3" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd7", 00:07:45.982 "bdev_name": "Malloc2p4" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd8", 00:07:45.982 "bdev_name": "Malloc2p5" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd9", 00:07:45.982 "bdev_name": "Malloc2p6" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd10", 00:07:45.982 "bdev_name": "Malloc2p7" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd11", 00:07:45.982 "bdev_name": "TestPT" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd12", 00:07:45.982 "bdev_name": "raid0" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd13", 00:07:45.982 "bdev_name": "concat0" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd14", 00:07:45.982 "bdev_name": "raid1" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd15", 00:07:45.982 "bdev_name": "AIO0" 00:07:45.982 } 00:07:45.982 ]' 00:07:45.982 11:48:36 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:07:45.982 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd0", 00:07:45.982 "bdev_name": "Malloc0" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd1", 00:07:45.982 "bdev_name": "Malloc1p0" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd2", 00:07:45.982 "bdev_name": "Malloc1p1" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd3", 00:07:45.982 "bdev_name": "Malloc2p0" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd4", 00:07:45.982 "bdev_name": "Malloc2p1" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd5", 00:07:45.982 "bdev_name": "Malloc2p2" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd6", 00:07:45.982 "bdev_name": "Malloc2p3" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd7", 00:07:45.982 "bdev_name": "Malloc2p4" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd8", 00:07:45.982 "bdev_name": "Malloc2p5" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd9", 00:07:45.982 "bdev_name": "Malloc2p6" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd10", 00:07:45.982 "bdev_name": "Malloc2p7" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd11", 00:07:45.982 "bdev_name": "TestPT" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd12", 00:07:45.982 "bdev_name": "raid0" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd13", 00:07:45.982 "bdev_name": "concat0" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd14", 00:07:45.982 "bdev_name": "raid1" 00:07:45.982 }, 00:07:45.982 { 00:07:45.982 "nbd_device": "/dev/nbd15", 00:07:45.982 "bdev_name": "AIO0" 00:07:45.982 } 00:07:45.982 ]' 00:07:45.982 11:48:36 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.242 11:48:36 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.242 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.501 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.762 11:48:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:47.020 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:47.020 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:47.020 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:47.020 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.020 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.020 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:07:47.020 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.020 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.021 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.021 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.280 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.539 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.822 11:48:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.822 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:48.080 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:48.080 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:48.080 11:48:38 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:48.080 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.080 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.080 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:48.080 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:48.080 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.080 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.080 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.339 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.598 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:07:48.855 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:48.855 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:48.856 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:48.856 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.856 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.856 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:48.856 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:48.856 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.856 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.856 11:48:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:49.114 11:48:39 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:49.114 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:49.374 /dev/nbd0 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:49.374 1+0 records in 00:07:49.374 1+0 records out 00:07:49.374 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000151958 s, 27.0 MB/s 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.374 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:07:49.634 /dev/nbd1 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:07:49.634 1+0 records in 00:07:49.634 1+0 records out 00:07:49.634 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023533 s, 17.4 MB/s 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.634 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:07:49.893 /dev/nbd10 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:49.893 11:48:39 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:49.893 1+0 records in 00:07:49.893 1+0 records out 00:07:49.893 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000216623 s, 18.9 MB/s 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.893 11:48:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:07:49.893 /dev/nbd11 00:07:49.893 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.151 1+0 records in 00:07:50.151 1+0 records out 00:07:50.151 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246973 s, 16.6 MB/s 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:07:50.151 /dev/nbd12 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.151 1+0 records in 00:07:50.151 1+0 records out 00:07:50.151 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265802 s, 15.4 MB/s 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:50.151 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.151 11:48:40 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:50.152 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:07:50.410 /dev/nbd13 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.410 1+0 records in 00:07:50.410 1+0 records out 00:07:50.410 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023819 s, 17.2 MB/s 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:50.410 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:07:50.669 /dev/nbd14 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.669 1+0 records in 00:07:50.669 1+0 records out 00:07:50.669 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242406 s, 
16.9 MB/s 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:50.669 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:07:50.929 /dev/nbd15 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.929 1+0 records in 00:07:50.929 1+0 records out 00:07:50.929 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287647 s, 14.2 MB/s 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:50.929 11:48:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:07:50.929 /dev/nbd2 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:50.929 1+0 records in 00:07:50.929 1+0 records out 00:07:50.929 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286954 s, 14.3 MB/s 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:50.929 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:07:51.188 /dev/nbd3 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:07:51.188 11:48:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:51.188 1+0 records in 00:07:51.188 1+0 records out 00:07:51.188 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285396 s, 14.4 MB/s 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.188 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:07:51.446 /dev/nbd4 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:51.446 1+0 records in 00:07:51.446 1+0 records out 00:07:51.446 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269518 s, 15.2 MB/s 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.446 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:07:51.705 /dev/nbd5 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:51.705 1+0 records in 00:07:51.705 1+0 records out 00:07:51.705 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266587 s, 15.4 MB/s 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:07:51.705 /dev/nbd6 00:07:51.705 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:07:51.964 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:07:51.964 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:07:51.964 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:51.964 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.964 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.964 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:07:51.964 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:51.964 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.964 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.965 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:51.965 1+0 records in 00:07:51.965 1+0 records out 00:07:51.965 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00029356 s, 14.0 MB/s 00:07:51.965 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.965 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:51.965 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.965 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.965 11:48:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.965 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.965 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.965 11:48:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:07:51.965 /dev/nbd7 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:07:51.965 11:48:42 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:51.965 1+0 records in 00:07:51.965 1+0 records out 00:07:51.965 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313025 s, 13.1 MB/s 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:51.965 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:07:52.224 /dev/nbd8 00:07:52.224 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:07:52.224 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:07:52.224 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:07:52.224 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- 
# local i 00:07:52.224 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:52.224 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:52.224 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:07:52.224 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:52.225 1+0 records in 00:07:52.225 1+0 records out 00:07:52.225 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000390897 s, 10.5 MB/s 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:52.225 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:07:52.484 /dev/nbd9 00:07:52.484 11:48:42 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:52.484 1+0 records in 00:07:52.484 1+0 records out 00:07:52.484 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000340219 s, 12.0 MB/s 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:52.484 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd0", 00:07:52.484 "bdev_name": "Malloc0" 00:07:52.484 }, 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd1", 00:07:52.484 "bdev_name": "Malloc1p0" 00:07:52.484 }, 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd10", 00:07:52.484 "bdev_name": "Malloc1p1" 00:07:52.484 }, 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd11", 00:07:52.484 "bdev_name": "Malloc2p0" 00:07:52.484 }, 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd12", 00:07:52.484 "bdev_name": "Malloc2p1" 00:07:52.484 }, 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd13", 00:07:52.484 "bdev_name": "Malloc2p2" 00:07:52.484 }, 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd14", 00:07:52.484 "bdev_name": "Malloc2p3" 00:07:52.484 }, 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd15", 00:07:52.484 "bdev_name": "Malloc2p4" 00:07:52.484 }, 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd2", 00:07:52.484 "bdev_name": "Malloc2p5" 00:07:52.484 }, 00:07:52.484 { 00:07:52.484 "nbd_device": "/dev/nbd3", 00:07:52.484 "bdev_name": "Malloc2p6" 00:07:52.484 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd4", 00:07:52.485 "bdev_name": "Malloc2p7" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd5", 00:07:52.485 "bdev_name": "TestPT" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd6", 00:07:52.485 
"bdev_name": "raid0" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd7", 00:07:52.485 "bdev_name": "concat0" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd8", 00:07:52.485 "bdev_name": "raid1" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd9", 00:07:52.485 "bdev_name": "AIO0" 00:07:52.485 } 00:07:52.485 ]' 00:07:52.485 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd0", 00:07:52.485 "bdev_name": "Malloc0" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd1", 00:07:52.485 "bdev_name": "Malloc1p0" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd10", 00:07:52.485 "bdev_name": "Malloc1p1" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd11", 00:07:52.485 "bdev_name": "Malloc2p0" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd12", 00:07:52.485 "bdev_name": "Malloc2p1" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd13", 00:07:52.485 "bdev_name": "Malloc2p2" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd14", 00:07:52.485 "bdev_name": "Malloc2p3" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd15", 00:07:52.485 "bdev_name": "Malloc2p4" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd2", 00:07:52.485 "bdev_name": "Malloc2p5" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd3", 00:07:52.485 "bdev_name": "Malloc2p6" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd4", 00:07:52.485 "bdev_name": "Malloc2p7" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd5", 00:07:52.485 "bdev_name": "TestPT" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd6", 00:07:52.485 "bdev_name": "raid0" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd7", 00:07:52.485 "bdev_name": "concat0" 00:07:52.485 }, 00:07:52.485 { 
00:07:52.485 "nbd_device": "/dev/nbd8", 00:07:52.485 "bdev_name": "raid1" 00:07:52.485 }, 00:07:52.485 { 00:07:52.485 "nbd_device": "/dev/nbd9", 00:07:52.485 "bdev_name": "AIO0" 00:07:52.485 } 00:07:52.485 ]' 00:07:52.485 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:52.743 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:52.743 /dev/nbd1 00:07:52.743 /dev/nbd10 00:07:52.743 /dev/nbd11 00:07:52.743 /dev/nbd12 00:07:52.743 /dev/nbd13 00:07:52.743 /dev/nbd14 00:07:52.743 /dev/nbd15 00:07:52.743 /dev/nbd2 00:07:52.743 /dev/nbd3 00:07:52.743 /dev/nbd4 00:07:52.743 /dev/nbd5 00:07:52.743 /dev/nbd6 00:07:52.743 /dev/nbd7 00:07:52.743 /dev/nbd8 00:07:52.743 /dev/nbd9' 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:52.744 /dev/nbd1 00:07:52.744 /dev/nbd10 00:07:52.744 /dev/nbd11 00:07:52.744 /dev/nbd12 00:07:52.744 /dev/nbd13 00:07:52.744 /dev/nbd14 00:07:52.744 /dev/nbd15 00:07:52.744 /dev/nbd2 00:07:52.744 /dev/nbd3 00:07:52.744 /dev/nbd4 00:07:52.744 /dev/nbd5 00:07:52.744 /dev/nbd6 00:07:52.744 /dev/nbd7 00:07:52.744 /dev/nbd8 00:07:52.744 /dev/nbd9' 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:07:52.744 256+0 records in 00:07:52.744 256+0 records out 00:07:52.744 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103826 s, 101 MB/s 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:52.744 256+0 records in 00:07:52.744 256+0 records out 00:07:52.744 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0640183 s, 16.4 MB/s 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:52.744 256+0 records in 00:07:52.744 256+0 records out 00:07:52.744 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0660106 s, 15.9 MB/s 00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:07:52.744 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:07:53.001 256+0 records in 00:07:53.001 256+0 records out 00:07:53.001 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0655006 s, 16.0 MB/s 00:07:53.001 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.001 11:48:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:07:53.001 256+0 records in 00:07:53.001 256+0 records out 00:07:53.002 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0683682 s, 15.3 MB/s 00:07:53.002 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.002 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:07:53.002 256+0 records in 00:07:53.002 256+0 records out 00:07:53.002 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0657384 s, 16.0 MB/s 00:07:53.002 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.002 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:07:53.002 256+0 records in 00:07:53.002 256+0 records out 00:07:53.002 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0654115 s, 16.0 MB/s 00:07:53.002 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.002 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:07:53.260 256+0 records in 
00:07:53.260 256+0 records out 00:07:53.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0652017 s, 16.1 MB/s 00:07:53.260 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.260 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:07:53.260 256+0 records in 00:07:53.260 256+0 records out 00:07:53.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0646456 s, 16.2 MB/s 00:07:53.260 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.260 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:07:53.260 256+0 records in 00:07:53.260 256+0 records out 00:07:53.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0654645 s, 16.0 MB/s 00:07:53.260 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.260 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:07:53.260 256+0 records in 00:07:53.260 256+0 records out 00:07:53.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0653661 s, 16.0 MB/s 00:07:53.260 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.260 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:07:53.519 256+0 records in 00:07:53.519 256+0 records out 00:07:53.519 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0646172 s, 16.2 MB/s 00:07:53.519 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.519 11:48:43 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:07:53.519 256+0 records in 00:07:53.519 256+0 records out 00:07:53.519 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0660822 s, 15.9 MB/s 00:07:53.519 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.519 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:07:53.519 256+0 records in 00:07:53.519 256+0 records out 00:07:53.519 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0656766 s, 16.0 MB/s 00:07:53.519 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.519 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:07:53.519 256+0 records in 00:07:53.519 256+0 records out 00:07:53.519 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0663074 s, 15.8 MB/s 00:07:53.519 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.519 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:07:53.785 256+0 records in 00:07:53.785 256+0 records out 00:07:53.785 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0679191 s, 15.4 MB/s 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:07:53.785 256+0 records in 00:07:53.785 256+0 records out 
00:07:53.785 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0646348 s, 16.2 MB/s 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock 
'/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:53.785 11:48:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.044 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.303 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.563 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:54.822 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:54.822 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:54.822 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:54.822 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:54.822 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:54.822 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:54.822 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.822 11:48:44 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.822 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.822 11:48:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 
/proc/partitions 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.081 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.339 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.597 11:48:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:55.856 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:55.856 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:55.856 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:55.856 11:48:46 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.856 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.856 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:07:55.856 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.856 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.856 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:55.857 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.115 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:56.374 11:48:46 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.374 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:56.633 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:56.633 
11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:56.633 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:56.633 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.633 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.633 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:56.633 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.633 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.633 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.633 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:56.891 11:48:46 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:56.891 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:56.891 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:56.891 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local 
nbd_list 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:07:57.150 malloc_lvol_verify 00:07:57.150 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:07:57.409 b693a0ba-5f36-4629-97d6-aea220f9cf00 00:07:57.409 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:07:57.409 f2a7fada-7adc-491a-8f57-c4ceeb315576 00:07:57.409 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:07:57.669 /dev/nbd0 00:07:57.669 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:07:57.669 mke2fs 1.46.5 (30-Dec-2021) 00:07:57.669 Discarding device blocks: 0/4096 done 00:07:57.669 Creating filesystem with 4096 1k blocks and 1024 inodes 00:07:57.669 00:07:57.669 Allocating group tables: 0/1 done 00:07:57.669 Writing inode tables: 0/1 done 00:07:57.669 Creating journal (1024 blocks): done 00:07:57.669 Writing superblocks and filesystem accounting information: 0/1 done 00:07:57.669 00:07:57.669 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:07:57.669 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:07:57.669 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:57.669 11:48:47 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:07:57.669 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:57.669 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:57.669 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:57.669 11:48:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 570573 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 570573 ']' 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 570573 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:57.928 11:48:48 
blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 570573 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 570573' 00:07:57.928 killing process with pid 570573 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 570573 00:07:57.928 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 570573 00:07:58.186 11:48:48 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:07:58.186 00:07:58.186 real 0m16.333s 00:07:58.186 user 0m21.765s 00:07:58.186 sys 0m7.985s 00:07:58.186 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:58.186 11:48:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:58.186 ************************************ 00:07:58.186 END TEST bdev_nbd 00:07:58.186 ************************************ 00:07:58.186 11:48:48 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:07:58.186 11:48:48 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:07:58.186 11:48:48 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:07:58.186 11:48:48 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:07:58.187 11:48:48 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:07:58.187 11:48:48 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:58.187 11:48:48 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:58.187 11:48:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:58.187 ************************************ 00:07:58.187 START TEST 
bdev_fio 00:07:58.187 ************************************ 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:58.187 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:07:58.187 
11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:07:58.187 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in 
"${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:07:58.446 11:48:48 
blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev 
--iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:58.446 11:48:48 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:58.446 ************************************ 00:07:58.446 START TEST bdev_fio_rw_verify 00:07:58.446 ************************************ 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 
00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:58.446 11:48:48 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:58.446 11:48:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:58.705 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:58.705 fio-3.35 00:07:58.705 Starting 16 threads 00:08:10.919 00:08:10.919 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=574186: Fri Jul 12 11:48:59 2024 00:08:10.919 read: IOPS=105k, BW=408MiB/s (428MB/s)(4084MiB/10001msec) 00:08:10.919 slat (nsec): min=1908, max=2957.9k, avg=30817.62, stdev=13804.55 00:08:10.919 clat (usec): min=8, max=3232, avg=260.75, stdev=124.66 00:08:10.919 lat (usec): min=12, max=3256, avg=291.57, stdev=131.96 00:08:10.919 clat percentiles (usec): 00:08:10.919 | 50.000th=[ 258], 99.000th=[ 523], 99.900th=[ 586], 99.990th=[ 717], 00:08:10.919 | 99.999th=[ 840] 00:08:10.919 write: IOPS=162k, BW=634MiB/s (665MB/s)(6253MiB/9864msec); 0 zone resets 00:08:10.919 slat (usec): min=4, max=289, avg=41.52, stdev=13.41 00:08:10.919 clat (usec): min=8, 
max=1198, avg=304.13, stdev=138.77 00:08:10.919 lat (usec): min=25, max=1321, avg=345.65, stdev=145.62 00:08:10.919 clat percentiles (usec): 00:08:10.919 | 50.000th=[ 297], 99.000th=[ 619], 99.900th=[ 766], 99.990th=[ 857], 00:08:10.919 | 99.999th=[ 1057] 00:08:10.920 bw ( KiB/s): min=542576, max=899322, per=98.88%, avg=641849.37, stdev=5019.07, samples=304 00:08:10.920 iops : min=135644, max=224828, avg=160462.21, stdev=1254.74, samples=304 00:08:10.920 lat (usec) : 10=0.01%, 20=0.07%, 50=1.24%, 100=7.07%, 250=34.39% 00:08:10.920 lat (usec) : 500=51.10%, 750=6.04%, 1000=0.08% 00:08:10.920 lat (msec) : 2=0.01%, 4=0.01% 00:08:10.920 cpu : usr=99.38%, sys=0.26%, ctx=632, majf=0, minf=2813 00:08:10.920 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:10.920 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:10.920 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:10.920 issued rwts: total=1045410,1600651,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:10.920 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:10.920 00:08:10.920 Run status group 0 (all jobs): 00:08:10.920 READ: bw=408MiB/s (428MB/s), 408MiB/s-408MiB/s (428MB/s-428MB/s), io=4084MiB (4282MB), run=10001-10001msec 00:08:10.920 WRITE: bw=634MiB/s (665MB/s), 634MiB/s-634MiB/s (665MB/s-665MB/s), io=6253MiB (6556MB), run=9864-9864msec 00:08:10.920 00:08:10.920 real 0m11.264s 00:08:10.920 user 2m48.135s 00:08:10.920 sys 0m1.062s 00:08:10.920 11:48:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.920 11:48:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:08:10.920 ************************************ 00:08:10.920 END TEST bdev_fio_rw_verify 00:08:10.920 ************************************ 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:08:10.920 11:48:59 
blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:08:10.920 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 
-- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:10.921 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "29a8fdaa-3220-4071-b25e-738ed86662a7"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "29a8fdaa-3220-4071-b25e-738ed86662a7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "beaefb4e-f654-55cf-bb7e-e49b737552c1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "beaefb4e-f654-55cf-bb7e-e49b737552c1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "dbb42538-0ae6-52d9-9bcf-133bf8425691"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dbb42538-0ae6-52d9-9bcf-133bf8425691",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "d17783e3-052b-52cb-bd66-9b2ce9ea6276"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d17783e3-052b-52cb-bd66-9b2ce9ea6276",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' 
"seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "8ad10dcd-f415-55d1-ba1b-65c6d55f91ba"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8ad10dcd-f415-55d1-ba1b-65c6d55f91ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "7028417e-6dc8-5281-9f6f-bb788ffbd1e6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7028417e-6dc8-5281-9f6f-bb788ffbd1e6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": 
false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "b371e7e7-051d-5780-8690-87a00573951f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b371e7e7-051d-5780-8690-87a00573951f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "83c5d8fb-d379-549f-804b-f70859975ba1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "83c5d8fb-d379-549f-804b-f70859975ba1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": 
{' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "f34ea871-47b6-55b6-8ef3-58b44e0cef72"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f34ea871-47b6-55b6-8ef3-58b44e0cef72",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "d0e84be1-e6cd-57b9-af72-7a772f020081"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d0e84be1-e6cd-57b9-af72-7a772f020081",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 
49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "8bf9b11e-eff3-5d98-b73f-2922458a6197"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8bf9b11e-eff3-5d98-b73f-2922458a6197",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "18849010-fbb6-51f4-b18b-cc7a42f9e5cc"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "18849010-fbb6-51f4-b18b-cc7a42f9e5cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' 
' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "222481d5-1dab-4aae-bbba-3ca76a0366df"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "222481d5-1dab-4aae-bbba-3ca76a0366df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "222481d5-1dab-4aae-bbba-3ca76a0366df",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "ddb60a11-543b-4525-941d-473c8f0fa170",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "eea1506b-9bbb-4965-9ae6-f6a0326ffcfa",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": 
[' ' "9cb2fc3c-9bea-4d08-be6a-d1d368617882"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "9cb2fc3c-9bea-4d08-be6a-d1d368617882",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "9cb2fc3c-9bea-4d08-be6a-d1d368617882",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "2cfba0c8-8a4c-4205-b5b1-f9484a6d0a31",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "b0cdac0c-786d-43d6-ba0d-47919827d96b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "93c54a51-ea1d-4613-9c81-5ed60b054b00"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "93c54a51-ea1d-4613-9c81-5ed60b054b00",' 
' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "93c54a51-ea1d-4613-9c81-5ed60b054b00",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a93dba3e-c01c-4afb-a638-ac5eab3927c4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "491d6243-7f14-4d7a-95f3-8a9857aaf76b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "c3585356-201b-4eb4-af9b-c8f07b4a8910"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "c3585356-201b-4eb4-af9b-c8f07b4a8910",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:10.921 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:08:10.921 Malloc1p0 00:08:10.921 Malloc1p1 00:08:10.921 Malloc2p0 00:08:10.921 Malloc2p1 00:08:10.921 Malloc2p2 00:08:10.921 Malloc2p3 00:08:10.921 Malloc2p4 00:08:10.921 Malloc2p5 00:08:10.921 Malloc2p6 00:08:10.921 Malloc2p7 00:08:10.921 TestPT 00:08:10.921 raid0 00:08:10.921 concat0 ]] 00:08:10.921 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "29a8fdaa-3220-4071-b25e-738ed86662a7"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "29a8fdaa-3220-4071-b25e-738ed86662a7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": 
false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "beaefb4e-f654-55cf-bb7e-e49b737552c1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "beaefb4e-f654-55cf-bb7e-e49b737552c1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "dbb42538-0ae6-52d9-9bcf-133bf8425691"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dbb42538-0ae6-52d9-9bcf-133bf8425691",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' 
' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "d17783e3-052b-52cb-bd66-9b2ce9ea6276"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d17783e3-052b-52cb-bd66-9b2ce9ea6276",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "8ad10dcd-f415-55d1-ba1b-65c6d55f91ba"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8ad10dcd-f415-55d1-ba1b-65c6d55f91ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' 
' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "7028417e-6dc8-5281-9f6f-bb788ffbd1e6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7028417e-6dc8-5281-9f6f-bb788ffbd1e6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "b371e7e7-051d-5780-8690-87a00573951f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b371e7e7-051d-5780-8690-87a00573951f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": 
false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "83c5d8fb-d379-549f-804b-f70859975ba1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "83c5d8fb-d379-549f-804b-f70859975ba1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "f34ea871-47b6-55b6-8ef3-58b44e0cef72"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f34ea871-47b6-55b6-8ef3-58b44e0cef72",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' 
' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "d0e84be1-e6cd-57b9-af72-7a772f020081"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d0e84be1-e6cd-57b9-af72-7a772f020081",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "8bf9b11e-eff3-5d98-b73f-2922458a6197"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8bf9b11e-eff3-5d98-b73f-2922458a6197",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' 
"nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "18849010-fbb6-51f4-b18b-cc7a42f9e5cc"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "18849010-fbb6-51f4-b18b-cc7a42f9e5cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "222481d5-1dab-4aae-bbba-3ca76a0366df"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "222481d5-1dab-4aae-bbba-3ca76a0366df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": 
false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "222481d5-1dab-4aae-bbba-3ca76a0366df",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "ddb60a11-543b-4525-941d-473c8f0fa170",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "eea1506b-9bbb-4965-9ae6-f6a0326ffcfa",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "9cb2fc3c-9bea-4d08-be6a-d1d368617882"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "9cb2fc3c-9bea-4d08-be6a-d1d368617882",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' 
' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "9cb2fc3c-9bea-4d08-be6a-d1d368617882",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "2cfba0c8-8a4c-4205-b5b1-f9484a6d0a31",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "b0cdac0c-786d-43d6-ba0d-47919827d96b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "93c54a51-ea1d-4613-9c81-5ed60b054b00"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "93c54a51-ea1d-4613-9c81-5ed60b054b00",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' 
"dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "93c54a51-ea1d-4613-9c81-5ed60b054b00",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a93dba3e-c01c-4afb-a638-ac5eab3927c4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "491d6243-7f14-4d7a-95f3-8a9857aaf76b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "c3585356-201b-4eb4-af9b-c8f07b4a8910"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "c3585356-201b-4eb4-af9b-c8f07b4a8910",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 
00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:08:10.922 
11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2
00:08:10.922 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name')
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]'
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name')
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]'
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name')
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]'
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name')
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]'
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name')
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]'
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name')
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]'
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name')
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]'
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name')
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]'
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:10.923 11:48:59 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:08:10.923 ************************************
00:08:10.923 START TEST bdev_fio_trim
00:08:10.923 ************************************
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan')
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib=
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}"
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib=
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}"
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib=
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev'
00:08:10.923 11:48:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:08:10.923 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:08:10.923 fio-3.35
00:08:10.923 Starting 14 threads
00:08:20.891
00:08:20.891 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=576211: Fri Jul 12 11:49:10 2024
00:08:20.891 write: IOPS=142k, BW=556MiB/s (583MB/s)(5562MiB/10001msec); 0 zone resets
00:08:20.891 slat (usec): min=2, max=250, avg=34.71, stdev=10.49
00:08:20.891 clat (usec): min=22, max=2005, avg=247.34, stdev=86.97
00:08:20.891 lat (usec): min=39, max=2187, avg=282.05, stdev=91.16
00:08:20.891 clat percentiles (usec):
00:08:20.891 | 50.000th=[ 239], 99.000th=[ 433], 99.900th=[ 660], 99.990th=[ 783],
00:08:20.891 | 99.999th=[ 1762]
00:08:20.891 bw ( KiB/s): min=501248, max=736996, per=100.00%, avg=571146.74, stdev=4645.55, samples=266
00:08:20.891 iops : min=125312, max=184248, avg=142786.74, stdev=1161.38, samples=266
00:08:20.891 trim: IOPS=142k, BW=556MiB/s (583MB/s)(5562MiB/10001msec); 0 zone resets
00:08:20.891 slat (usec): min=4, max=3139, avg=23.68, stdev= 7.21
00:08:20.891 clat (usec): min=4, max=2188, avg=277.97, stdev=95.48
00:08:20.891 lat (usec): min=15, max=3699, avg=301.64, stdev=98.64
00:08:20.891 clat percentiles (usec):
00:08:20.891 | 50.000th=[ 273], 99.000th=[ 478], 99.900th=[ 725], 99.990th=[ 865],
00:08:20.891 | 99.999th=[ 1958]
00:08:20.891 bw ( KiB/s): min=501248, max=737004, per=100.00%, avg=571147.16, stdev=4645.66, samples=266
00:08:20.891 iops : min=125312, max=184250, avg=142786.74, stdev=1161.40, samples=266
00:08:20.891 lat (usec) : 10=0.01%, 20=0.03%, 50=0.17%, 100=1.83%, 250=46.54%
00:08:20.891 lat (usec) : 500=50.89%, 750=0.49%, 1000=0.03%
00:08:20.891 lat (msec) : 2=0.01%, 4=0.01%
00:08:20.891 cpu : usr=99.66%, sys=0.00%, ctx=581, majf=0, minf=793
00:08:20.891 IO depths : 1=12.5%, 2=24.9%, 4=50.0%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0%
00:08:20.891 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:08:20.891 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:08:20.891 issued rwts: total=0,1423865,1423870,0 short=0,0,0,0 dropped=0,0,0,0
00:08:20.891 latency : target=0, window=0, percentile=100.00%, depth=8
00:08:20.891
00:08:20.891 Run status group 0 (all jobs):
00:08:20.891 WRITE: bw=556MiB/s (583MB/s), 556MiB/s-556MiB/s (583MB/s-583MB/s), io=5562MiB (5832MB), run=10001-10001msec
00:08:20.891 TRIM: bw=556MiB/s (583MB/s), 556MiB/s-556MiB/s (583MB/s-583MB/s), io=5562MiB (5832MB), run=10001-10001msec
00:08:21.151
00:08:21.151 real 0m11.285s
00:08:21.151 user 2m28.085s
00:08:21.151 sys 0m0.715s
00:08:21.151 11:49:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:21.151 11:49:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:08:21.151 ************************************
00:08:21.151 END TEST bdev_fio_trim
00:08:21.151 ************************************
00:08:21.151 11:49:11 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0
00:08:21.151 11:49:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:08:21.151 11:49:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:08:21.151 11:49:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:08:21.151 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:08:21.151 11:49:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:08:21.151
00:08:21.151 real 0m22.847s
00:08:21.151 user 5m16.398s
00:08:21.151 sys 0m1.917s
00:08:21.151 11:49:11 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:21.151 11:49:11 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:08:21.151 ************************************
00:08:21.151 END TEST bdev_fio
00:08:21.151 ************************************
00:08:21.151 11:49:11 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:08:21.151 11:49:11 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:08:21.151 11:49:11 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:08:21.151 11:49:11 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:08:21.151 11:49:11 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:21.151 11:49:11 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:08:21.151 ************************************
00:08:21.151 START TEST bdev_verify
00:08:21.151 ************************************
00:08:21.151 11:49:11 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:08:21.411 [2024-07-12 11:49:11.345497] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization...
00:08:21.411 [2024-07-12 11:49:11.345537] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid578016 ]
00:08:21.411 [2024-07-12 11:49:11.408315] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:08:21.411 [2024-07-12 11:49:11.485540] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:08:21.411 [2024-07-12 11:49:11.485542] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:21.411 [2024-07-12 11:49:11.624442] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:08:21.411 [2024-07-12 11:49:11.624487] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:08:21.411 [2024-07-12 11:49:11.624494] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:08:21.411 [2024-07-12 11:49:11.632451] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:08:21.411 [2024-07-12 11:49:11.632468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:08:21.411 [2024-07-12 11:49:11.640468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:08:21.411 [2024-07-12 11:49:11.640483] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:08:21.670 [2024-07-12 11:49:11.707768] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:08:21.670 [2024-07-12 11:49:11.707806] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:08:21.670 [2024-07-12 11:49:11.707814] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd7d0b0
00:08:21.670 [2024-07-12 11:49:11.707820] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:08:21.670 [2024-07-12 11:49:11.708864] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:08:21.670 [2024-07-12 11:49:11.708885] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:08:21.670 Running I/O for 5 seconds...
00:08:28.244
00:08:28.244 Latency(us)
00:08:28.244 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:08:28.244 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x1000
00:08:28.244 Malloc0 : 5.15 1641.21 6.41 0.00 0.00 77863.21 310.13 187745.04
00:08:28.244 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x1000 length 0x1000
00:08:28.244 Malloc0 : 5.15 1616.50 6.31 0.00 0.00 79054.08 456.41 287609.42
00:08:28.244 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x800
00:08:28.244 Malloc1p0 : 5.19 838.64 3.28 0.00 0.00 151979.95 2637.04 175761.31
00:08:28.244 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x800 length 0x800
00:08:28.244 Malloc1p0 : 5.19 839.08 3.28 0.00 0.00 151916.94 2637.04 165774.87
00:08:28.244 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x800
00:08:28.244 Malloc1p1 : 5.19 838.29 3.27 0.00 0.00 151716.63 2668.25 171766.74
00:08:28.244 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x800 length 0x800
00:08:28.244 Malloc1p1 : 5.19 838.82 3.28 0.00 0.00 151621.35 2637.04 163777.58
00:08:28.244 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x200
00:08:28.244 Malloc2p0 : 5.19 837.95 3.27 0.00 0.00 151446.32 2637.04 166773.52
00:08:28.244 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x200 length 0x200
00:08:28.244 Malloc2p0 : 5.19 838.57 3.28 0.00 0.00 151337.61 2637.04 158784.37
00:08:28.244 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x200
00:08:28.244 Malloc2p1 : 5.20 837.61 3.27 0.00 0.00 151176.19 2730.67 164776.23
00:08:28.244 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x200 length 0x200
00:08:28.244 Malloc2p1 : 5.19 838.23 3.27 0.00 0.00 151067.19 2699.46 154789.79
00:08:28.244 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x200
00:08:28.244 Malloc2p2 : 5.20 837.26 3.27 0.00 0.00 150909.76 2637.04 161780.30
00:08:28.244 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x200 length 0x200
00:08:28.244 Malloc2p2 : 5.19 837.88 3.27 0.00 0.00 150808.87 2637.04 151793.86
00:08:28.244 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x200
00:08:28.244 Malloc2p3 : 5.20 836.92 3.27 0.00 0.00 150640.07 2590.23 156787.08
00:08:28.244 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x200 length 0x200
00:08:28.244 Malloc2p3 : 5.20 837.54 3.27 0.00 0.00 150535.39 2574.63 148797.93
00:08:28.244 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x200
00:08:28.244 Malloc2p4 : 5.20 836.57 3.27 0.00 0.00 150376.99 2512.21 152792.50
00:08:28.244 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x200 length 0x200
00:08:28.244 Malloc2p4 : 5.20 837.20 3.27 0.00 0.00 150270.35 2527.82 143804.71
00:08:28.244 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x200
00:08:28.244 Malloc2p5 : 5.20 836.23 3.27 0.00 0.00 150113.12 2590.23 149796.57
00:08:28.244 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x200 length 0x200
00:08:28.244 Malloc2p5 : 5.20 836.86 3.27 0.00 0.00 150008.36 2590.23 140808.78
00:08:28.244 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x200
00:08:28.244 Malloc2p6 : 5.21 835.89 3.27 0.00 0.00 149857.02 2621.44 145802.00
00:08:28.244 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x200 length 0x200
00:08:28.244 Malloc2p6 : 5.20 836.51 3.27 0.00 0.00 149749.71 2637.04 137812.85
00:08:28.244 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x200
00:08:28.244 Malloc2p7 : 5.21 835.59 3.26 0.00 0.00 149583.94 2699.46 140808.78
00:08:28.244 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x200 length 0x200
00:08:28.244 Malloc2p7 : 5.20 836.17 3.27 0.00 0.00 149478.96 2652.65 132819.63
00:08:28.244 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x1000
00:08:28.244 TestPT : 5.22 833.65 3.26 0.00 0.00 149537.32 8488.47 139810.13
00:08:28.244 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x1000 length 0x1000
00:08:28.244 TestPT : 5.22 812.27 3.17 0.00 0.00 153219.55 9299.87 189742.32
00:08:28.244 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x2000
00:08:28.244 raid0 : 5.21 834.81 3.26 0.00 0.00 148845.16 2543.42 120835.90
00:08:28.244 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x2000 length 0x2000
00:08:28.244 raid0 : 5.21 835.61 3.26 0.00 0.00 148739.69 2574.63 110849.46
00:08:28.244 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x2000
00:08:28.244 concat0 : 5.22 834.43 3.26 0.00 0.00 148612.51 2559.02 116841.33
00:08:28.244 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x2000 length 0x2000
00:08:28.244 concat0 : 5.21 835.15 3.26 0.00 0.00 148501.71 2543.42 107354.21
00:08:28.244 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x1000
00:08:28.244 raid1 : 5.22 834.03 3.26 0.00 0.00 148376.17 2995.93 113346.07
00:08:28.244 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x1000 length 0x1000
00:08:28.244 raid1 : 5.21 834.84 3.26 0.00 0.00 148246.62 3058.35 112846.75
00:08:28.244 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:08:28.244 Verification LBA range: start 0x0 length 0x4e2
00:08:28.244 AIO0 : 5.22 833.87 3.26 0.00 0.00 148091.41 1100.07 118339.29
00:08:28.245 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:08:28.245 Verification LBA range: start 0x4e2 length 0x4e2
00:08:28.245 AIO0 : 5.22 834.51 3.26 0.00 0.00 148001.56 1092.27 117340.65
00:08:28.245 ===================================================================================================================
00:08:28.245 Total : 28328.66 110.66 0.00 0.00 141989.10 310.13 287609.42
00:08:28.245
00:08:28.245 real 0m6.208s
00:08:28.245 user 0m11.714s
00:08:28.245 sys 0m0.281s
00:08:28.245 11:49:17 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:28.245 11:49:17 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:08:28.245 ************************************
00:08:28.245 END TEST bdev_verify
00:08:28.245 ************************************
00:08:28.245 11:49:17 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:08:28.245 11:49:17 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:08:28.245 11:49:17 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:08:28.245 11:49:17 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:28.245 11:49:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:08:28.245 ************************************
00:08:28.245 START TEST bdev_verify_big_io
00:08:28.245 ************************************
00:08:28.245 11:49:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:08:28.245 [2024-07-12 11:49:17.606933] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization...
00:08:28.245 [2024-07-12 11:49:17.606968] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid579161 ]
00:08:28.245 [2024-07-12 11:49:17.662160] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:08:28.245 [2024-07-12 11:49:17.734260] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:08:28.245 [2024-07-12 11:49:17.734262] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:28.245 [2024-07-12 11:49:17.868382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:08:28.245 [2024-07-12 11:49:17.868426] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:08:28.245 [2024-07-12 11:49:17.868437] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:08:28.245 [2024-07-12 11:49:17.876392] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:08:28.245 [2024-07-12 11:49:17.876407] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:08:28.245 [2024-07-12 11:49:17.884406] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:08:28.245 [2024-07-12 11:49:17.884419] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:08:28.245 [2024-07-12 11:49:17.951643] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:08:28.245 [2024-07-12 11:49:17.951679] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:08:28.245 [2024-07-12 11:49:17.951687] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2a380b0
00:08:28.245 [2024-07-12 11:49:17.951693] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:08:28.245 [2024-07-12 11:49:17.952714] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:08:28.245 [2024-07-12 11:49:17.952733] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:08:28.245 [2024-07-12 11:49:18.097770] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.098549] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.099726] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.100476] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.101693] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.102448] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.103652] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.104870] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.105630] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.106833] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.107598] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.108820] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.109586] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.110805] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.111495] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.112558] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:08:28.245 [2024-07-12 11:49:18.130758] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:08:28.245 [2024-07-12 11:49:18.132286] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:08:28.245 Running I/O for 5 seconds...
00:08:34.919
00:08:34.919 Latency(us)
00:08:34.919 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:08:34.919 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x0 length 0x100
00:08:34.919 Malloc0 : 5.75 267.31 16.71 0.00 0.00 472395.74 573.44 1462014.54
00:08:34.919 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x100 length 0x100
00:08:34.919 Malloc0 : 5.62 273.31 17.08 0.00 0.00 461701.27 561.74 1661743.30
00:08:34.919 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x0 length 0x80
00:08:34.919 Malloc1p0 : 5.86 144.75 9.05 0.00 0.00 842453.13 1755.43 1725656.50
00:08:34.919 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x80 length 0x80
00:08:34.919 Malloc1p0 : 6.18 54.36 3.40 0.00 0.00 2183500.99 1061.06 3355443.20
00:08:34.919 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x0 length 0x80
00:08:34.919 Malloc1p1 : 6.10 52.42 3.28 0.00 0.00 2242374.77 1022.05 3627074.32
00:08:34.919 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x80 length 0x80
00:08:34.919 Malloc1p1 : 6.18 54.35 3.40 0.00 0.00 2135381.41 1022.05 3243595.09
00:08:34.919 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x0 length 0x20
00:08:34.919 Malloc2p0 : 5.80 38.59 2.41 0.00 0.00 767414.30 454.46 1246307.47
00:08:34.919 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x20 length 0x20
00:08:34.919 Malloc2p0 : 5.77 41.58 2.60 0.00 0.00 704325.33 475.92 1070546.16
00:08:34.919 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x0 length 0x20
00:08:34.919 Malloc2p1 : 5.81 38.58 2.41 0.00 0.00 762818.76 452.51 1230329.17
00:08:34.919 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x20 length 0x20
00:08:34.919 Malloc2p1 : 5.77 41.58 2.60 0.00 0.00 699926.63 518.83 1054567.86
00:08:34.919 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x0 length 0x20
00:08:34.919 Malloc2p2 : 5.81 38.58 2.41 0.00 0.00 758743.63 462.26 1214350.87
00:08:34.919 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x20 length 0x20
00:08:34.919 Malloc2p2 : 5.82 43.99 2.75 0.00 0.00 664507.51 460.31 1038589.56
00:08:34.919 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x0 length 0x20
00:08:34.919 Malloc2p3 : 5.81 38.57 2.41 0.00 0.00 754248.94 456.41 1198372.57
00:08:34.919 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x20 length 0x20
00:08:34.919 Malloc2p3 : 5.82 43.98 2.75 0.00 0.00 660258.44 464.21 1018616.69
00:08:34.919 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x0 length 0x20
00:08:34.919 Malloc2p4 : 5.81 38.57 2.41 0.00 0.00 749881.09 456.41 1182394.27
00:08:34.919 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x20 length 0x20
00:08:34.919 Malloc2p4 : 5.82 43.97 2.75 0.00 0.00 656382.28 468.11 1002638.38
00:08:34.919 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x0 length 0x20
00:08:34.919 Malloc2p5 : 5.81 38.56 2.41 0.00 0.00 745408.60 468.11 1166415.97
00:08:34.919 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:34.919 Verification LBA range: start 0x20 length 0x20
00:08:34.919 Malloc2p5 : 5.82 43.97 2.75 0.00 0.00 652525.07 485.67 986660.08
00:08:34.920 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:34.920 Verification LBA range: start 0x0 length 0x20
00:08:34.920 Malloc2p6 : 5.81 38.55 2.41 0.00 0.00 741015.41 456.41 1150437.67
00:08:34.920 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:34.920 Verification LBA range: start 0x20 length 0x20
00:08:34.920 Malloc2p6 : 5.82 43.96 2.75 0.00 0.00 648601.17 470.06 970681.78
00:08:34.920 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:34.920 Verification LBA range: start 0x0 length 0x20
00:08:34.920 Malloc2p7 : 5.81 38.55 2.41 0.00 0.00 736770.54 454.46 1134459.37
00:08:34.920 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:34.920 Verification LBA range: start 0x20 length 0x20
00:08:34.920 Malloc2p7 : 5.82 43.95 2.75 0.00 0.00 644772.32 460.31 954703.48
00:08:34.920 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:08:34.920 Verification LBA range: start 0x0 length 0x100
00:08:34.920 TestPT : 6.10 50.12 3.13 0.00 0.00 2188263.56 70903.71 3115768.69
00:08:34.920 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:08:34.920 Verification LBA range: start 0x100 length 0x100
00:08:34.920 TestPT : 6.19 51.66 3.23 0.00 0.00 2116553.00 48683.89 2876094.17
00:08:34.920 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:08:34.920 Verification LBA range: start 0x0 length 0x200
00:08:34.920 raid0 : 6.17 57.02 3.56 0.00 0.00 1891905.49 1076.66 3275551.70
00:08:34.920 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:08:34.920 Verification LBA range: start 0x200 length 0x200
00:08:34.920 raid0 : 6.18
62.10 3.88 0.00 0.00 1736915.58 1100.07 2876094.17 00:08:34.920 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:34.920 Verification LBA range: start 0x0 length 0x200 00:08:34.920 concat0 : 6.18 62.16 3.89 0.00 0.00 1716715.31 1084.46 3163703.59 00:08:34.920 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:34.920 Verification LBA range: start 0x200 length 0x200 00:08:34.920 concat0 : 6.15 76.07 4.75 0.00 0.00 1401217.44 1107.87 2764246.06 00:08:34.920 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:34.920 Verification LBA range: start 0x0 length 0x100 00:08:34.920 raid1 : 6.15 72.84 4.55 0.00 0.00 1445787.14 1435.55 3051855.48 00:08:34.920 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:34.920 Verification LBA range: start 0x100 length 0x100 00:08:34.920 raid1 : 6.18 85.37 5.34 0.00 0.00 1226956.39 1388.74 2652397.96 00:08:34.920 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:08:34.920 Verification LBA range: start 0x0 length 0x4e 00:08:34.920 AIO0 : 6.18 85.33 5.33 0.00 0.00 739589.78 569.54 1821526.31 00:08:34.920 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:08:34.920 Verification LBA range: start 0x4e length 0x4e 00:08:34.920 AIO0 : 6.20 76.82 4.80 0.00 0.00 817125.40 329.63 1517938.59 00:08:34.920 =================================================================================================================== 00:08:34.920 Total : 2181.52 136.35 0.00 0.00 1012696.90 329.63 3627074.32 00:08:34.920 00:08:34.920 real 0m7.155s 00:08:34.920 user 0m13.638s 00:08:34.920 sys 0m0.296s 00:08:34.920 11:49:24 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.920 11:49:24 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:08:34.920 ************************************ 00:08:34.920 END TEST bdev_verify_big_io 
00:08:34.920 ************************************ 00:08:34.920 11:49:24 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:34.920 11:49:24 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:34.920 11:49:24 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:34.920 11:49:24 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.920 11:49:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:34.920 ************************************ 00:08:34.920 START TEST bdev_write_zeroes 00:08:34.920 ************************************ 00:08:34.920 11:49:24 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:34.920 [2024-07-12 11:49:24.840645] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
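The bdev_verify_big_io results above arrive as rows of the form `Malloc0 : 5.75 267.31 16.71 0.00 0.00 472395.74 573.44 1462014.54`, with the column meanings given once in the `Device Information` header. A small sketch of pulling such a row apart (`parse_result_row` and `FIELDS` are hypothetical helpers written for this log, not part of SPDK or bdevperf):

```python
# Column meanings, taken from the table header in the log above:
#   runtime(s) IOPS MiB/s Fail/s TO/s Average min max   (latencies in us)
FIELDS = ("runtime_s", "iops", "mib_s", "fail_s", "to_s",
          "avg_lat_us", "min_lat_us", "max_lat_us")

def parse_result_row(line):
    """Split a 'name : v1 v2 ...' bdevperf row into (name, stats dict)."""
    name, _, rest = line.partition(":")
    values = [float(v) for v in rest.split()]
    return name.strip(), dict(zip(FIELDS, values))

name, stats = parse_result_row(
    "Malloc0 : 5.75 267.31 16.71 0.00 0.00 472395.74 573.44 1462014.54")
```

Anything extra before the colon (core mask, queue depth) simply becomes part of the returned name; only the numeric columns after the colon are interpreted.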
00:08:34.920 [2024-07-12 11:49:24.840680] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid580324 ] 00:08:34.920 [2024-07-12 11:49:24.901379] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.920 [2024-07-12 11:49:24.972513] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.920 [2024-07-12 11:49:25.109692] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:34.920 [2024-07-12 11:49:25.109738] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:34.920 [2024-07-12 11:49:25.109745] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:34.920 [2024-07-12 11:49:25.117706] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:34.920 [2024-07-12 11:49:25.117723] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:34.920 [2024-07-12 11:49:25.125716] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:34.920 [2024-07-12 11:49:25.125730] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:35.183 [2024-07-12 11:49:25.193196] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:35.183 [2024-07-12 11:49:25.193233] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:35.183 [2024-07-12 11:49:25.193241] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x101fda0 00:08:35.183 [2024-07-12 11:49:25.193247] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:35.183 [2024-07-12 11:49:25.194264] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:35.183 
[2024-07-12 11:49:25.194285] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:08:35.183 Running I/O for 1 seconds...
00:08:36.558 Latency(us)
00:08:36.558 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:08:36.558 (workload: write_zeroes, Core Mask 0x1, depth 128, IO size 4096)
Malloc0    : 1.03 7567.83 29.56 0.00 0.00 16909.76 487.62 31332.45
Malloc1p0  : 1.03 7561.11 29.54 0.00 0.00 16903.90 670.96 30708.30
Malloc1p1  : 1.03 7554.41 29.51 0.00 0.00 16883.86 682.67 29959.31
Malloc2p0  : 1.03 7547.64 29.48 0.00 0.00 16866.49 713.87 29210.33
Malloc2p1  : 1.04 7540.93 29.46 0.00 0.00 16852.41 670.96 28461.35
Malloc2p2  : 1.04 7534.21 29.43 0.00 0.00 16844.94 667.06 27837.20
Malloc2p3  : 1.04 7527.50 29.40 0.00 0.00 16835.57 698.27 27088.21
Malloc2p4  : 1.04 7520.82 29.38 0.00 0.00 16823.48 667.06 26339.23
Malloc2p5  : 1.04 7514.14 29.35 0.00 0.00 16810.40 667.06 25715.08
Malloc2p6  : 1.04 7507.30 29.33 0.00 0.00 16798.19 667.06 24966.10
Malloc2p7  : 1.04 7500.70 29.30 0.00 0.00 16786.96 663.16 24341.94
TestPT     : 1.04 7494.10 29.27 0.00 0.00 16771.03 694.37 23592.96
raid0      : 1.04 7486.48 29.24 0.00 0.00 16757.59 1217.10 22344.66
concat0    : 1.04 7479.02 29.21 0.00 0.00 16723.00 1224.90 20971.52
raid1      : 1.05 7469.63 29.18 0.00 0.00 16691.22 1833.45 19348.72
AIO0       : 1.05 7463.95 29.16 0.00 0.00 16641.38 674.86 19348.72
===================================================================================================================
Total : 120269.77 469.80 0.00 0.00 16806.26 487.62 31332.45
00:08:36.559 real 0m1.919s
00:08:36.559 user 0m1.630s
00:08:36.559 sys 0m0.235s
00:08:36.559 11:49:26 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:36.559 11:49:26 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:08:36.559 ************************************
00:08:36.559 END TEST bdev_write_zeroes
00:08:36.559 ************************************
00:08:36.559 11:49:26 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:08:36.559 11:49:26 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:36.559 11:49:26 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:36.559 11:49:26 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.559 11:49:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:36.559 ************************************ 00:08:36.559 START TEST bdev_json_nonenclosed 00:08:36.559 ************************************ 00:08:36.559 11:49:26 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:36.818 [2024-07-12 11:49:26.831051] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:08:36.818 [2024-07-12 11:49:26.831088] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid580675 ] 00:08:36.818 [2024-07-12 11:49:26.892994] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.818 [2024-07-12 11:49:26.965128] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.818 [2024-07-12 11:49:26.965180] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
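The `not enclosed in {}` error above, and the `'subsystems' should be an array` error from the bdev_json_nonarray test further down, are the two rejection paths these negative tests exercise. A loose sketch of those checks (`validate_config` is a hypothetical stand-in written for illustration, not json_config.c's actual parser):

```python
import json

def validate_config(text):
    """Illustrate the two json_config.c rejections seen in this log:
    the config must be a JSON object ('enclosed in {}'), and its
    'subsystems' member must be an array."""
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError:
        return "not enclosed in {}"
    if not isinstance(cfg, dict):
        return "not enclosed in {}"      # valid JSON, but not an object
    if not isinstance(cfg.get("subsystems"), list):
        return "'subsystems' should be an array"
    return "ok"
```

Both negative tests then expect bdevperf to exit non-zero, which the harness records as `es=234` and deliberately treats as success.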
00:08:36.818 [2024-07-12 11:49:26.965208] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:36.818 [2024-07-12 11:49:26.965214] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:36.818 00:08:36.818 real 0m0.259s 00:08:36.818 user 0m0.165s 00:08:36.818 sys 0m0.091s 00:08:36.818 11:49:27 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:08:36.818 11:49:27 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:36.818 11:49:27 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:08:36.818 ************************************ 00:08:36.818 END TEST bdev_json_nonenclosed 00:08:36.818 ************************************ 00:08:37.076 11:49:27 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:08:37.076 11:49:27 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:08:37.076 11:49:27 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:37.076 11:49:27 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:37.076 11:49:27 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.076 11:49:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:37.076 ************************************ 00:08:37.076 START TEST bdev_json_nonarray 00:08:37.076 ************************************ 00:08:37.076 11:49:27 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:37.076 [2024-07-12 11:49:27.155570] Starting 
SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:08:37.076 [2024-07-12 11:49:27.155603] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid580812 ] 00:08:37.076 [2024-07-12 11:49:27.216708] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.076 [2024-07-12 11:49:27.288380] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.076 [2024-07-12 11:49:27.288434] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:08:37.076 [2024-07-12 11:49:27.288445] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:37.077 [2024-07-12 11:49:27.288451] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:37.335 00:08:37.335 real 0m0.252s 00:08:37.335 user 0m0.164s 00:08:37.335 sys 0m0.086s 00:08:37.335 11:49:27 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:08:37.335 11:49:27 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:37.335 11:49:27 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:08:37.335 ************************************ 00:08:37.335 END TEST bdev_json_nonarray 00:08:37.335 ************************************ 00:08:37.335 11:49:27 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:08:37.335 11:49:27 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:08:37.335 11:49:27 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:08:37.335 11:49:27 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:08:37.335 11:49:27 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:37.335 11:49:27 blockdev_general -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.335 11:49:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:37.335 ************************************ 00:08:37.335 START TEST bdev_qos 00:08:37.335 ************************************ 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=580842 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 580842' 00:08:37.335 Process qos testing pid: 580842 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 580842 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 580842 ']' 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:37.335 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:37.335 11:49:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:37.335 [2024-07-12 11:49:27.467453] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:08:37.335 [2024-07-12 11:49:27.467489] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid580842 ] 00:08:37.335 [2024-07-12 11:49:27.530562] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.594 [2024-07-12 11:49:27.608290] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:38.162 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:38.163 Malloc_0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- 
common/autotest_common.sh@10 -- # set +x 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:38.163 [ 00:08:38.163 { 00:08:38.163 "name": "Malloc_0", 00:08:38.163 "aliases": [ 00:08:38.163 "2ed38a8a-cdc0-40e9-8c36-16fa6d92412a" 00:08:38.163 ], 00:08:38.163 "product_name": "Malloc disk", 00:08:38.163 "block_size": 512, 00:08:38.163 "num_blocks": 262144, 00:08:38.163 "uuid": "2ed38a8a-cdc0-40e9-8c36-16fa6d92412a", 00:08:38.163 "assigned_rate_limits": { 00:08:38.163 "rw_ios_per_sec": 0, 00:08:38.163 "rw_mbytes_per_sec": 0, 00:08:38.163 "r_mbytes_per_sec": 0, 00:08:38.163 "w_mbytes_per_sec": 0 00:08:38.163 }, 00:08:38.163 "claimed": false, 00:08:38.163 "zoned": false, 00:08:38.163 "supported_io_types": { 00:08:38.163 "read": true, 00:08:38.163 "write": true, 00:08:38.163 "unmap": true, 00:08:38.163 "flush": true, 00:08:38.163 "reset": true, 00:08:38.163 "nvme_admin": false, 00:08:38.163 "nvme_io": false, 00:08:38.163 "nvme_io_md": false, 00:08:38.163 "write_zeroes": true, 00:08:38.163 "zcopy": true, 00:08:38.163 "get_zone_info": false, 00:08:38.163 "zone_management": false, 00:08:38.163 "zone_append": false, 00:08:38.163 "compare": false, 00:08:38.163 "compare_and_write": false, 00:08:38.163 "abort": true, 00:08:38.163 "seek_hole": false, 00:08:38.163 "seek_data": false, 00:08:38.163 "copy": true, 00:08:38.163 "nvme_iov_md": false 00:08:38.163 }, 00:08:38.163 "memory_domains": [ 00:08:38.163 { 00:08:38.163 "dma_device_id": "system", 00:08:38.163 "dma_device_type": 1 00:08:38.163 }, 00:08:38.163 { 00:08:38.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:38.163 "dma_device_type": 2 00:08:38.163 } 
00:08:38.163 ], 00:08:38.163 "driver_specific": {} 00:08:38.163 } 00:08:38.163 ] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:38.163 Null_1 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- 
common/autotest_common.sh@10 -- # set +x 00:08:38.163 [ 00:08:38.163 { 00:08:38.163 "name": "Null_1", 00:08:38.163 "aliases": [ 00:08:38.163 "6a113a53-4a0a-448a-a9d7-a415bd5441ea" 00:08:38.163 ], 00:08:38.163 "product_name": "Null disk", 00:08:38.163 "block_size": 512, 00:08:38.163 "num_blocks": 262144, 00:08:38.163 "uuid": "6a113a53-4a0a-448a-a9d7-a415bd5441ea", 00:08:38.163 "assigned_rate_limits": { 00:08:38.163 "rw_ios_per_sec": 0, 00:08:38.163 "rw_mbytes_per_sec": 0, 00:08:38.163 "r_mbytes_per_sec": 0, 00:08:38.163 "w_mbytes_per_sec": 0 00:08:38.163 }, 00:08:38.163 "claimed": false, 00:08:38.163 "zoned": false, 00:08:38.163 "supported_io_types": { 00:08:38.163 "read": true, 00:08:38.163 "write": true, 00:08:38.163 "unmap": false, 00:08:38.163 "flush": false, 00:08:38.163 "reset": true, 00:08:38.163 "nvme_admin": false, 00:08:38.163 "nvme_io": false, 00:08:38.163 "nvme_io_md": false, 00:08:38.163 "write_zeroes": true, 00:08:38.163 "zcopy": false, 00:08:38.163 "get_zone_info": false, 00:08:38.163 "zone_management": false, 00:08:38.163 "zone_append": false, 00:08:38.163 "compare": false, 00:08:38.163 "compare_and_write": false, 00:08:38.163 "abort": true, 00:08:38.163 "seek_hole": false, 00:08:38.163 "seek_data": false, 00:08:38.163 "copy": false, 00:08:38.163 "nvme_iov_md": false 00:08:38.163 }, 00:08:38.163 "driver_specific": {} 00:08:38.163 } 00:08:38.163 ] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:38.163 11:49:28 
blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:38.163 11:49:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:08:38.423 Running I/O for 60 seconds... 
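qos_function_test first measures Malloc_0's unthrottled read IOPS with iostat.py, then derives the cap it will enforce. Judging only from the numbers later in this log (io_result=98575 giving iops_limit=24000, then lower_limit=21600 and upper_limit=26400), the cap appears to be roughly a quarter of the measured rate floored to the nearest thousand, and run_qos_test accepts any measured result within ±10% of it. A sketch of that arithmetic under those assumptions:

```python
def derive_iops_limit(io_result):
    # Assumed rule, inferred from io_result=98575 -> iops_limit=24000:
    # about 1/4 of the unthrottled IOPS, floored to a multiple of 1000.
    return io_result // 4 // 1000 * 1000

def qos_band(qos_limit):
    # run_qos_test's pass band: the throttled result must land within +/-10%.
    return qos_limit * 9 // 10, qos_limit * 11 // 10

limit = derive_iops_limit(98575)   # unthrottled IOPS measured in this log
lower, upper = qos_band(limit)
```

The throttled measurement of 23989 IOPS later in the log then falls inside the 21600–26400 band, which is why bdev_qos_iops passes.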
00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 98575.98 394303.90 0.00 0.00 397312.00 0.00 0.00 ' 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=98575.98 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 98575 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=98575 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=24000 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 24000 -gt 1000 ']' 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 24000 Malloc_0 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 24000 IOPS Malloc_0 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:43.699 11:49:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:43.699 ************************************ 00:08:43.699 START TEST bdev_qos_iops 00:08:43.699 ************************************ 00:08:43.699 11:49:33 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 24000 IOPS Malloc_0 00:08:43.699 11:49:33 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=24000 00:08:43.699 11:49:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:43.699 11:49:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:08:43.699 11:49:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:08:43.699 11:49:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:43.699 11:49:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:43.699 11:49:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:43.699 11:49:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:43.699 11:49:33 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 23989.87 95959.47 0.00 0.00 96960.00 0.00 0.00 ' 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=23989.87 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 23989 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=23989 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=21600 00:08:48.976 11:49:38 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=26400 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 23989 -lt 21600 ']' 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 23989 -gt 26400 ']' 00:08:48.976 00:08:48.976 real 0m5.186s 00:08:48.976 user 0m0.096s 00:08:48.976 sys 0m0.030s 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:48.976 11:49:38 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:08:48.976 ************************************ 00:08:48.976 END TEST bdev_qos_iops 00:08:48.976 ************************************ 00:08:48.976 11:49:38 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:08:48.976 11:49:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:08:48.976 11:49:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:48.976 11:49:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:08:48.976 11:49:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:48.976 11:49:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:48.976 11:49:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:08:48.976 11:49:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 30794.53 123178.11 0.00 0.00 124928.00 0.00 0.00 ' 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:54.251 11:49:43 
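The `bdev_qos_iops` pass/fail decision traced above (lower_limit=21600, upper_limit=26400, measured 23989) is a ±10% window around the configured cap. A minimal sketch of that check, with the values taken from the log:

```shell
# ±10% tolerance window around the 24000 IOPS cap; the measured value
# (23989, from iostat.py) must fall inside it for the test to pass.
qos_limit=24000
qos_result=23989
lower_limit=$((qos_limit * 9 / 10))    # 21600
upper_limit=$((qos_limit * 11 / 10))   # 26400
if [ "$qos_result" -lt "$lower_limit" ] || [ "$qos_result" -gt "$upper_limit" ]; then
    echo "qos test failed"
else
    echo "qos test passed"
fi
```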
blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=124928.00 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 124928 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=124928 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=12 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 12 -lt 2 ']' 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 12 Null_1 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 12 BANDWIDTH Null_1 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:54.251 11:49:43 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:54.251 ************************************ 00:08:54.251 START TEST bdev_qos_bw 00:08:54.251 ************************************ 00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 12 BANDWIDTH Null_1 00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=12 00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 
00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:08:54.251 11:49:44 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 3071.49 12285.96 0.00 0.00 12444.00 0.00 0.00 ' 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=12444.00 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 12444 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=12444 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=12288 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=11059 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=13516 00:08:59.527 11:49:49 
blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 12444 -lt 11059 ']' 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 12444 -gt 13516 ']' 00:08:59.527 00:08:59.527 real 0m5.185s 00:08:59.527 user 0m0.081s 00:08:59.527 sys 0m0.029s 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:08:59.527 ************************************ 00:08:59.527 END TEST bdev_qos_bw 00:08:59.527 ************************************ 00:08:59.527 11:49:49 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:08:59.527 11:49:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:08:59.527 11:49:49 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:59.527 11:49:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:59.527 11:49:49 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:59.527 11:49:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:08:59.527 11:49:49 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:59.527 11:49:49 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:59.527 11:49:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:59.527 ************************************ 00:08:59.527 START TEST bdev_qos_ro_bw 00:08:59.527 ************************************ 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 
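For the `bdev_qos_bw` case above, the log measures 124928 KB/s on Null_1, derives a 12 MB/s cap, and then checks the throttled result (12444 KB/s) against a ±10% window expressed back in KB. A sketch of that arithmetic (the divide-by-10 scaling is read off the logged numbers, not quoted from the script):

```shell
# Measured KB/s -> MB/s cap -> KB tolerance window, as in the log:
# 124928 -> bw_limit=12 -> qos_limit=12288, window [11059, 13516].
iostat_result=124928                     # $6 from iostat.py: KB read per sec
bw_limit=$((iostat_result / 1024 / 10))  # 12 MB/s cap for bdev_set_qos_limit
qos_limit=$((bw_limit * 1024))           # 12288, cap re-expressed in KB/s
lower_limit=$((qos_limit * 9 / 10))      # 11059
upper_limit=$((qos_limit * 11 / 10))     # 13516
```

The measured 12444 KB/s sits inside [11059, 13516], which is why both `-lt` and `-gt` comparisons above fall through and the test passes.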
00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:59.527 11:49:49 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 512.32 2049.28 0.00 0.00 2060.00 0.00 0.00 ' 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- 
bdev/blockdev.sh@394 -- # qos_limit=2048 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:09:04.803 00:09:04.803 real 0m5.139s 00:09:04.803 user 0m0.082s 00:09:04.803 sys 0m0.027s 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.803 11:49:54 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:09:04.803 ************************************ 00:09:04.803 END TEST bdev_qos_ro_bw 00:09:04.803 ************************************ 00:09:04.803 11:49:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:09:04.803 11:49:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:09:04.803 11:49:54 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:04.803 11:49:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:04.803 11:49:54 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:04.803 11:49:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:09:04.803 11:49:54 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:04.803 11:49:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:05.062 00:09:05.062 Latency(us) 00:09:05.062 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:05.062 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:05.062 Malloc_0 : 26.47 33239.17 
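The read-only bandwidth test (`bdev_qos_ro_bw`) repeats the same window check with a 2 MB/s cap set via `--r_mbytes_per_sec`: the log shows qos_limit=2048 KB, window [1843, 2252], and a measured 2060 KB/s. A quick sketch with those values:

```shell
# 2 MB/s read-only cap -> 2048 KB/s, ±10% window; measured 2060 passes.
qos_limit=$((2 * 1024))               # 2048
lower_limit=$((qos_limit * 9 / 10))   # 1843
upper_limit=$((qos_limit * 11 / 10))  # 2252
qos_result=2060                       # KB/s read for Malloc_0 in the log
```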
129.84 0.00 0.00 7627.71 1380.94 503316.48 00:09:05.062 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:05.062 Null_1 : 26.57 32096.97 125.38 0.00 0.00 7961.76 538.33 102360.99 00:09:05.062 =================================================================================================================== 00:09:05.062 Total : 65336.14 255.22 0.00 0.00 7792.14 538.33 503316.48 00:09:05.062 0 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 580842 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 580842 ']' 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 580842 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 580842 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 580842' 00:09:05.062 killing process with pid 580842 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 580842 00:09:05.062 Received shutdown signal, test time was about 26.627803 seconds 00:09:05.062 00:09:05.062 Latency(us) 00:09:05.062 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:05.062 =================================================================================================================== 00:09:05.062 Total : 0.00 0.00 0.00 
0.00 0.00 0.00 0.00 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 580842 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:09:05.062 00:09:05.062 real 0m27.883s 00:09:05.062 user 0m28.473s 00:09:05.062 sys 0m0.604s 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:05.062 11:49:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:05.062 ************************************ 00:09:05.062 END TEST bdev_qos 00:09:05.062 ************************************ 00:09:05.322 11:49:55 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:05.322 11:49:55 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:09:05.322 11:49:55 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:05.322 11:49:55 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:05.322 11:49:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.322 ************************************ 00:09:05.322 START TEST bdev_qd_sampling 00:09:05.322 ************************************ 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=585487 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 585487' 00:09:05.322 Process bdev QD sampling period testing pid: 585487 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 
00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 585487 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 585487 ']' 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:05.322 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:05.322 11:49:55 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:05.322 [2024-07-12 11:49:55.418591] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:09:05.322 [2024-07-12 11:49:55.418630] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid585487 ] 00:09:05.322 [2024-07-12 11:49:55.482054] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:05.322 [2024-07-12 11:49:55.556766] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:05.322 [2024-07-12 11:49:55.556767] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:06.260 Malloc_QD 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:06.260 [ 00:09:06.260 { 00:09:06.260 "name": "Malloc_QD", 00:09:06.260 "aliases": [ 00:09:06.260 "a793228c-3551-4a4d-8461-ec75b59514d4" 00:09:06.260 ], 00:09:06.260 "product_name": "Malloc disk", 00:09:06.260 "block_size": 512, 00:09:06.260 "num_blocks": 262144, 00:09:06.260 "uuid": "a793228c-3551-4a4d-8461-ec75b59514d4", 00:09:06.260 "assigned_rate_limits": { 00:09:06.260 "rw_ios_per_sec": 0, 00:09:06.260 "rw_mbytes_per_sec": 0, 00:09:06.260 "r_mbytes_per_sec": 0, 00:09:06.260 "w_mbytes_per_sec": 0 00:09:06.260 }, 00:09:06.260 "claimed": false, 00:09:06.260 "zoned": false, 00:09:06.260 "supported_io_types": { 00:09:06.260 "read": true, 00:09:06.260 "write": true, 00:09:06.260 "unmap": true, 00:09:06.260 "flush": true, 00:09:06.260 "reset": true, 00:09:06.260 "nvme_admin": false, 00:09:06.260 "nvme_io": false, 00:09:06.260 "nvme_io_md": false, 00:09:06.260 "write_zeroes": true, 00:09:06.260 "zcopy": true, 00:09:06.260 "get_zone_info": false, 00:09:06.260 "zone_management": false, 00:09:06.260 "zone_append": false, 00:09:06.260 "compare": false, 00:09:06.260 "compare_and_write": false, 00:09:06.260 "abort": true, 00:09:06.260 "seek_hole": false, 00:09:06.260 "seek_data": false, 00:09:06.260 "copy": true, 00:09:06.260 "nvme_iov_md": false 
00:09:06.260 }, 00:09:06.260 "memory_domains": [ 00:09:06.260 { 00:09:06.260 "dma_device_id": "system", 00:09:06.260 "dma_device_type": 1 00:09:06.260 }, 00:09:06.260 { 00:09:06.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:06.260 "dma_device_type": 2 00:09:06.260 } 00:09:06.260 ], 00:09:06.260 "driver_specific": {} 00:09:06.260 } 00:09:06.260 ] 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:09:06.260 11:49:56 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:06.260 Running I/O for 5 seconds... 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # 
xtrace_disable 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:09:08.165 "tick_rate": 2100000000, 00:09:08.165 "ticks": 9617998971573926, 00:09:08.165 "bdevs": [ 00:09:08.165 { 00:09:08.165 "name": "Malloc_QD", 00:09:08.165 "bytes_read": 993047040, 00:09:08.165 "num_read_ops": 242436, 00:09:08.165 "bytes_written": 0, 00:09:08.165 "num_write_ops": 0, 00:09:08.165 "bytes_unmapped": 0, 00:09:08.165 "num_unmap_ops": 0, 00:09:08.165 "bytes_copied": 0, 00:09:08.165 "num_copy_ops": 0, 00:09:08.165 "read_latency_ticks": 2072807119980, 00:09:08.165 "max_read_latency_ticks": 10270858, 00:09:08.165 "min_read_latency_ticks": 169664, 00:09:08.165 "write_latency_ticks": 0, 00:09:08.165 "max_write_latency_ticks": 0, 00:09:08.165 "min_write_latency_ticks": 0, 00:09:08.165 "unmap_latency_ticks": 0, 00:09:08.165 "max_unmap_latency_ticks": 0, 00:09:08.165 "min_unmap_latency_ticks": 0, 00:09:08.165 "copy_latency_ticks": 0, 00:09:08.165 "max_copy_latency_ticks": 0, 00:09:08.165 "min_copy_latency_ticks": 0, 00:09:08.165 "io_error": {}, 00:09:08.165 "queue_depth_polling_period": 10, 00:09:08.165 "queue_depth": 512, 00:09:08.165 "io_time": 20, 00:09:08.165 "weighted_io_time": 10240 00:09:08.165 } 00:09:08.165 ] 00:09:08.165 }' 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- 
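The `bdev_get_iostat` payload above is internally consistent: the sampled `queue_depth` of 512 equals `weighted_io_time / io_time` from the same JSON (10240 / 20), which is what the QD-sampling feature is accumulating each polling period. A one-liner check with the logged values:

```shell
# Average queue depth implied by the iostat counters in the log:
# weighted_io_time (10240) / io_time (20) = 512, matching "queue_depth": 512.
io_time=20
weighted_io_time=10240
queue_depth=$((weighted_io_time / io_time))
echo "$queue_depth"
```

The test itself only extracts `queue_depth_polling_period` with `jq` and verifies it equals the configured 10 ms, as the subsequent `'[' 10 -ne 10 ']'` trace shows.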
bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.165 00:09:08.165 Latency(us) 00:09:08.165 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:08.165 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:08.165 Malloc_QD : 2.00 62216.32 243.03 0.00 0.00 4105.72 1076.66 4462.69 00:09:08.165 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:08.165 Malloc_QD : 2.00 63345.36 247.44 0.00 0.00 4033.14 616.35 4899.60 00:09:08.165 =================================================================================================================== 00:09:08.165 Total : 125561.67 490.48 0.00 0.00 4069.09 616.35 4899.60 00:09:08.165 0 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 585487 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 585487 ']' 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 585487 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:09:08.165 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:08.166 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 585487 00:09:08.424 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:08.424 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:08.424 11:49:58 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 585487' 00:09:08.424 killing process with pid 585487 00:09:08.424 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 585487 00:09:08.424 Received shutdown signal, test time was about 2.064812 seconds 00:09:08.424 00:09:08.424 Latency(us) 00:09:08.424 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:08.424 =================================================================================================================== 00:09:08.424 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:08.424 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 585487 00:09:08.424 11:49:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:09:08.424 00:09:08.424 real 0m3.225s 00:09:08.424 user 0m6.376s 00:09:08.424 sys 0m0.302s 00:09:08.424 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:08.424 11:49:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.424 ************************************ 00:09:08.424 END TEST bdev_qd_sampling 00:09:08.424 ************************************ 00:09:08.424 11:49:58 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:08.424 11:49:58 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:09:08.424 11:49:58 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:08.424 11:49:58 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:08.424 11:49:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:08.424 ************************************ 00:09:08.424 START TEST bdev_error 00:09:08.424 ************************************ 00:09:08.424 11:49:58 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # 
error_test_suite '' 00:09:08.424 11:49:58 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:09:08.424 11:49:58 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:09:08.424 11:49:58 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:09:08.424 11:49:58 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=586185 00:09:08.424 11:49:58 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 586185' 00:09:08.424 Process error testing pid: 586185 00:09:08.424 11:49:58 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:09:08.424 11:49:58 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 586185 00:09:08.424 11:49:58 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 586185 ']' 00:09:08.424 11:49:58 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:08.424 11:49:58 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:08.424 11:49:58 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:08.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:08.424 11:49:58 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:08.424 11:49:58 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:08.682 [2024-07-12 11:49:58.710359] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:09:08.682 [2024-07-12 11:49:58.710394] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid586185 ] 00:09:08.682 [2024-07-12 11:49:58.773971] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:08.682 [2024-07-12 11:49:58.849547] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:09.248 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:09.248 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:09:09.248 11:49:59 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:09.248 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:09.248 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:09.506 Dev_1 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.506 11:49:59 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:09.506 11:49:59 
blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:09.506 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:09.506 [ 00:09:09.506 { 00:09:09.506 "name": "Dev_1", 00:09:09.506 "aliases": [ 00:09:09.506 "1a4f45b0-60d6-4b21-bbc3-45cc50533b6c" 00:09:09.506 ], 00:09:09.506 "product_name": "Malloc disk", 00:09:09.506 "block_size": 512, 00:09:09.506 "num_blocks": 262144, 00:09:09.506 "uuid": "1a4f45b0-60d6-4b21-bbc3-45cc50533b6c", 00:09:09.507 "assigned_rate_limits": { 00:09:09.507 "rw_ios_per_sec": 0, 00:09:09.507 "rw_mbytes_per_sec": 0, 00:09:09.507 "r_mbytes_per_sec": 0, 00:09:09.507 "w_mbytes_per_sec": 0 00:09:09.507 }, 00:09:09.507 "claimed": false, 00:09:09.507 "zoned": false, 00:09:09.507 "supported_io_types": { 00:09:09.507 "read": true, 00:09:09.507 "write": true, 00:09:09.507 "unmap": true, 00:09:09.507 "flush": true, 00:09:09.507 "reset": true, 00:09:09.507 "nvme_admin": false, 00:09:09.507 "nvme_io": false, 00:09:09.507 "nvme_io_md": false, 00:09:09.507 "write_zeroes": true, 00:09:09.507 "zcopy": true, 00:09:09.507 "get_zone_info": false, 00:09:09.507 "zone_management": false, 00:09:09.507 "zone_append": false, 00:09:09.507 "compare": false, 00:09:09.507 "compare_and_write": false, 00:09:09.507 "abort": true, 00:09:09.507 "seek_hole": false, 00:09:09.507 "seek_data": false, 00:09:09.507 "copy": true, 00:09:09.507 "nvme_iov_md": false 00:09:09.507 }, 00:09:09.507 "memory_domains": [ 00:09:09.507 { 00:09:09.507 "dma_device_id": "system", 00:09:09.507 "dma_device_type": 1 00:09:09.507 }, 00:09:09.507 { 00:09:09.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:09.507 
"dma_device_type": 2 00:09:09.507 } 00:09:09.507 ], 00:09:09.507 "driver_specific": {} 00:09:09.507 } 00:09:09.507 ] 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:09.507 11:49:59 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:09.507 true 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.507 11:49:59 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:09.507 Dev_2 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.507 11:49:59 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 
-- # xtrace_disable 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:09.507 [ 00:09:09.507 { 00:09:09.507 "name": "Dev_2", 00:09:09.507 "aliases": [ 00:09:09.507 "4fe430af-fc96-4263-ae7e-5456ad113e64" 00:09:09.507 ], 00:09:09.507 "product_name": "Malloc disk", 00:09:09.507 "block_size": 512, 00:09:09.507 "num_blocks": 262144, 00:09:09.507 "uuid": "4fe430af-fc96-4263-ae7e-5456ad113e64", 00:09:09.507 "assigned_rate_limits": { 00:09:09.507 "rw_ios_per_sec": 0, 00:09:09.507 "rw_mbytes_per_sec": 0, 00:09:09.507 "r_mbytes_per_sec": 0, 00:09:09.507 "w_mbytes_per_sec": 0 00:09:09.507 }, 00:09:09.507 "claimed": false, 00:09:09.507 "zoned": false, 00:09:09.507 "supported_io_types": { 00:09:09.507 "read": true, 00:09:09.507 "write": true, 00:09:09.507 "unmap": true, 00:09:09.507 "flush": true, 00:09:09.507 "reset": true, 00:09:09.507 "nvme_admin": false, 00:09:09.507 "nvme_io": false, 00:09:09.507 "nvme_io_md": false, 00:09:09.507 "write_zeroes": true, 00:09:09.507 "zcopy": true, 00:09:09.507 "get_zone_info": false, 00:09:09.507 "zone_management": false, 00:09:09.507 "zone_append": false, 00:09:09.507 "compare": false, 00:09:09.507 "compare_and_write": false, 00:09:09.507 "abort": true, 00:09:09.507 "seek_hole": false, 00:09:09.507 "seek_data": false, 00:09:09.507 "copy": true, 00:09:09.507 "nvme_iov_md": false 00:09:09.507 }, 00:09:09.507 "memory_domains": [ 00:09:09.507 { 00:09:09.507 "dma_device_id": "system", 00:09:09.507 "dma_device_type": 1 00:09:09.507 }, 00:09:09.507 { 00:09:09.507 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:09:09.507 "dma_device_type": 2 00:09:09.507 } 00:09:09.507 ], 00:09:09.507 "driver_specific": {} 00:09:09.507 } 00:09:09.507 ] 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:09.507 11:49:59 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:09.507 11:49:59 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.507 11:49:59 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:09:09.507 11:49:59 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:09.507 Running I/O for 5 seconds... 00:09:10.443 11:50:00 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 586185 00:09:10.443 11:50:00 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 586185' 00:09:10.443 Process is existed as continue on error is set. 
Pid: 586185 00:09:10.443 11:50:00 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:09:10.443 11:50:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:10.443 11:50:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:10.443 11:50:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:10.443 11:50:00 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:09:10.443 11:50:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:10.443 11:50:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:10.443 11:50:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:10.443 11:50:00 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:09:10.703 Timeout while waiting for response: 00:09:10.703 00:09:10.703 00:09:14.896 00:09:14.897 Latency(us) 00:09:14.897 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:14.897 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:14.897 EE_Dev_1 : 0.93 57465.30 224.47 5.40 0.00 276.16 91.67 522.73 00:09:14.897 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:14.897 Dev_2 : 5.00 127324.14 497.36 0.00 0.00 123.46 41.20 18599.74 00:09:14.897 =================================================================================================================== 00:09:14.897 Total : 184789.44 721.83 5.40 0.00 135.24 41.20 18599.74 00:09:15.465 11:50:05 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 586185 00:09:15.466 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 586185 ']' 00:09:15.466 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 586185 00:09:15.466 11:50:05 blockdev_general.bdev_error 
-- common/autotest_common.sh@953 -- # uname 00:09:15.466 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:15.466 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 586185 00:09:15.466 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:09:15.466 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:09:15.466 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 586185' 00:09:15.466 killing process with pid 586185 00:09:15.466 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 586185 00:09:15.466 Received shutdown signal, test time was about 5.000000 seconds 00:09:15.466 00:09:15.466 Latency(us) 00:09:15.466 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:15.466 =================================================================================================================== 00:09:15.466 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:15.466 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 586185 00:09:15.725 11:50:05 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=587330 00:09:15.725 11:50:05 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 587330' 00:09:15.725 Process error testing pid: 587330 00:09:15.725 11:50:05 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:09:15.725 11:50:05 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 587330 00:09:15.725 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 587330 ']' 00:09:15.725 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@833 
-- # local rpc_addr=/var/tmp/spdk.sock 00:09:15.725 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:15.725 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:15.725 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:15.725 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:15.725 11:50:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:15.725 [2024-07-12 11:50:05.964402] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:09:15.725 [2024-07-12 11:50:05.964442] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid587330 ] 00:09:15.984 [2024-07-12 11:50:06.028219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:15.984 [2024-07-12 11:50:06.105665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:09:16.552 11:50:06 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:16.552 Dev_1 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:16.552 11:50:06 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:09:16.552 11:50:06 
blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:16.552 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:16.812 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:16.812 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:16.812 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:16.812 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:16.812 [ 00:09:16.812 { 00:09:16.812 "name": "Dev_1", 00:09:16.812 "aliases": [ 00:09:16.812 "a2eb5265-c3f4-4b30-a8ec-bfe868961511" 00:09:16.812 ], 00:09:16.812 "product_name": "Malloc disk", 00:09:16.812 "block_size": 512, 00:09:16.812 "num_blocks": 262144, 00:09:16.812 "uuid": "a2eb5265-c3f4-4b30-a8ec-bfe868961511", 00:09:16.812 "assigned_rate_limits": { 00:09:16.812 "rw_ios_per_sec": 0, 00:09:16.812 "rw_mbytes_per_sec": 0, 00:09:16.812 "r_mbytes_per_sec": 0, 00:09:16.812 "w_mbytes_per_sec": 0 00:09:16.812 }, 00:09:16.812 "claimed": false, 00:09:16.812 "zoned": false, 00:09:16.812 "supported_io_types": { 00:09:16.812 "read": true, 00:09:16.812 "write": true, 00:09:16.812 "unmap": true, 00:09:16.812 "flush": true, 00:09:16.812 "reset": true, 
00:09:16.812 "nvme_admin": false, 00:09:16.812 "nvme_io": false, 00:09:16.812 "nvme_io_md": false, 00:09:16.812 "write_zeroes": true, 00:09:16.812 "zcopy": true, 00:09:16.812 "get_zone_info": false, 00:09:16.812 "zone_management": false, 00:09:16.812 "zone_append": false, 00:09:16.812 "compare": false, 00:09:16.812 "compare_and_write": false, 00:09:16.812 "abort": true, 00:09:16.812 "seek_hole": false, 00:09:16.812 "seek_data": false, 00:09:16.812 "copy": true, 00:09:16.812 "nvme_iov_md": false 00:09:16.812 }, 00:09:16.812 "memory_domains": [ 00:09:16.812 { 00:09:16.812 "dma_device_id": "system", 00:09:16.812 "dma_device_type": 1 00:09:16.812 }, 00:09:16.812 { 00:09:16.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:16.812 "dma_device_type": 2 00:09:16.812 } 00:09:16.812 ], 00:09:16.813 "driver_specific": {} 00:09:16.813 } 00:09:16.813 ] 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:16.813 11:50:06 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:16.813 true 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:16.813 11:50:06 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:16.813 Dev_2 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:16.813 11:50:06 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # 
waitforbdev Dev_2 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:16.813 [ 00:09:16.813 { 00:09:16.813 "name": "Dev_2", 00:09:16.813 "aliases": [ 00:09:16.813 "ba9b5d0b-fb65-46bd-9bc5-2ae2b2d825ce" 00:09:16.813 ], 00:09:16.813 "product_name": "Malloc disk", 00:09:16.813 "block_size": 512, 00:09:16.813 "num_blocks": 262144, 00:09:16.813 "uuid": "ba9b5d0b-fb65-46bd-9bc5-2ae2b2d825ce", 00:09:16.813 "assigned_rate_limits": { 00:09:16.813 "rw_ios_per_sec": 0, 00:09:16.813 "rw_mbytes_per_sec": 0, 00:09:16.813 "r_mbytes_per_sec": 0, 00:09:16.813 "w_mbytes_per_sec": 0 00:09:16.813 }, 00:09:16.813 "claimed": false, 00:09:16.813 "zoned": false, 00:09:16.813 "supported_io_types": { 00:09:16.813 "read": true, 00:09:16.813 "write": true, 00:09:16.813 "unmap": true, 00:09:16.813 
"flush": true, 00:09:16.813 "reset": true, 00:09:16.813 "nvme_admin": false, 00:09:16.813 "nvme_io": false, 00:09:16.813 "nvme_io_md": false, 00:09:16.813 "write_zeroes": true, 00:09:16.813 "zcopy": true, 00:09:16.813 "get_zone_info": false, 00:09:16.813 "zone_management": false, 00:09:16.813 "zone_append": false, 00:09:16.813 "compare": false, 00:09:16.813 "compare_and_write": false, 00:09:16.813 "abort": true, 00:09:16.813 "seek_hole": false, 00:09:16.813 "seek_data": false, 00:09:16.813 "copy": true, 00:09:16.813 "nvme_iov_md": false 00:09:16.813 }, 00:09:16.813 "memory_domains": [ 00:09:16.813 { 00:09:16.813 "dma_device_id": "system", 00:09:16.813 "dma_device_type": 1 00:09:16.813 }, 00:09:16.813 { 00:09:16.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:16.813 "dma_device_type": 2 00:09:16.813 } 00:09:16.813 ], 00:09:16.813 "driver_specific": {} 00:09:16.813 } 00:09:16.813 ] 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:16.813 11:50:06 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:16.813 11:50:06 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 587330 00:09:16.813 11:50:06 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # 
valid_exec_arg wait 587330 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:16.813 11:50:06 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 587330 00:09:16.813 Running I/O for 5 seconds... 00:09:16.813 task offset: 197088 on job bdev=EE_Dev_1 fails 00:09:16.813 00:09:16.813 Latency(us) 00:09:16.813 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:16.813 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:16.813 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:09:16.813 EE_Dev_1 : 0.00 45267.49 176.83 10288.07 0.00 237.48 89.23 423.25 00:09:16.813 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:16.813 Dev_2 : 0.00 27947.60 109.17 0.00 0.00 423.09 85.33 784.09 00:09:16.813 =================================================================================================================== 00:09:16.813 Total : 73215.09 286.00 10288.07 0.00 338.15 85.33 784.09 00:09:16.813 [2024-07-12 11:50:06.975038] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:16.813 request: 00:09:16.813 { 00:09:16.813 "method": "perform_tests", 00:09:16.813 "req_id": 1 00:09:16.813 } 00:09:16.813 Got JSON-RPC error response 00:09:16.813 response: 00:09:16.813 { 00:09:16.813 "code": -32603, 00:09:16.813 "message": "bdevperf failed with error Operation not permitted" 00:09:16.813 } 00:09:17.073 11:50:07 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:09:17.073 11:50:07 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 
)) 00:09:17.073 11:50:07 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:09:17.073 11:50:07 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:09:17.073 11:50:07 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:09:17.073 11:50:07 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:17.073 00:09:17.073 real 0m8.548s 00:09:17.073 user 0m8.826s 00:09:17.073 sys 0m0.608s 00:09:17.073 11:50:07 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:17.073 11:50:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:17.073 ************************************ 00:09:17.073 END TEST bdev_error 00:09:17.073 ************************************ 00:09:17.073 11:50:07 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:17.073 11:50:07 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:09:17.073 11:50:07 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:17.073 11:50:07 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:17.073 11:50:07 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:17.073 ************************************ 00:09:17.073 START TEST bdev_stat 00:09:17.073 ************************************ 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=587589 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 587589' 00:09:17.073 Process Bdev IO statistics testing pid: 587589 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 587589 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 587589 ']' 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:17.073 11:50:07 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:17.074 11:50:07 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:17.074 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:17.074 11:50:07 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:17.074 11:50:07 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:17.333 [2024-07-12 11:50:07.322490] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:09:17.333 [2024-07-12 11:50:07.322535] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid587589 ] 00:09:17.333 [2024-07-12 11:50:07.385412] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:17.333 [2024-07-12 11:50:07.463936] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:17.333 [2024-07-12 11:50:07.463939] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:17.901 Malloc_STAT 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:17.901 11:50:08 
blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:17.901 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:18.161 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:18.161 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:09:18.161 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:18.161 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:18.161 [ 00:09:18.161 { 00:09:18.161 "name": "Malloc_STAT", 00:09:18.161 "aliases": [ 00:09:18.161 "45077be8-53d3-4c29-9d9c-c07aaf8a5172" 00:09:18.161 ], 00:09:18.161 "product_name": "Malloc disk", 00:09:18.161 "block_size": 512, 00:09:18.161 "num_blocks": 262144, 00:09:18.161 "uuid": "45077be8-53d3-4c29-9d9c-c07aaf8a5172", 00:09:18.161 "assigned_rate_limits": { 00:09:18.161 "rw_ios_per_sec": 0, 00:09:18.161 "rw_mbytes_per_sec": 0, 00:09:18.161 "r_mbytes_per_sec": 0, 00:09:18.161 "w_mbytes_per_sec": 0 00:09:18.161 }, 00:09:18.161 "claimed": false, 00:09:18.161 "zoned": false, 00:09:18.161 "supported_io_types": { 00:09:18.161 "read": true, 00:09:18.161 "write": true, 00:09:18.161 "unmap": true, 00:09:18.161 "flush": true, 00:09:18.161 "reset": true, 00:09:18.161 "nvme_admin": false, 00:09:18.161 "nvme_io": false, 00:09:18.161 "nvme_io_md": false, 00:09:18.161 "write_zeroes": true, 00:09:18.161 "zcopy": true, 00:09:18.161 "get_zone_info": false, 00:09:18.161 "zone_management": false, 00:09:18.161 "zone_append": false, 00:09:18.161 "compare": false, 00:09:18.161 "compare_and_write": false, 00:09:18.161 "abort": true, 00:09:18.161 "seek_hole": false, 00:09:18.161 "seek_data": false, 00:09:18.161 "copy": true, 00:09:18.161 "nvme_iov_md": false 00:09:18.161 }, 00:09:18.161 "memory_domains": [ 00:09:18.161 { 00:09:18.161 "dma_device_id": "system", 00:09:18.161 
"dma_device_type": 1 00:09:18.161 }, 00:09:18.161 { 00:09:18.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:18.161 "dma_device_type": 2 00:09:18.161 } 00:09:18.161 ], 00:09:18.161 "driver_specific": {} 00:09:18.161 } 00:09:18.161 ] 00:09:18.161 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:18.161 11:50:08 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:09:18.161 11:50:08 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:09:18.161 11:50:08 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:18.161 Running I/O for 10 seconds... 00:09:20.068 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:09:20.068 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:09:20.068 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:09:20.068 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:09:20.068 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:09:20.068 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:09:20.068 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:09:20.068 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:09:20.068 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:20.069 11:50:10 
blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:09:20.069 "tick_rate": 2100000000, 00:09:20.069 "ticks": 9618023901086350, 00:09:20.069 "bdevs": [ 00:09:20.069 { 00:09:20.069 "name": "Malloc_STAT", 00:09:20.069 "bytes_read": 1023455744, 00:09:20.069 "num_read_ops": 249860, 00:09:20.069 "bytes_written": 0, 00:09:20.069 "num_write_ops": 0, 00:09:20.069 "bytes_unmapped": 0, 00:09:20.069 "num_unmap_ops": 0, 00:09:20.069 "bytes_copied": 0, 00:09:20.069 "num_copy_ops": 0, 00:09:20.069 "read_latency_ticks": 2071004492330, 00:09:20.069 "max_read_latency_ticks": 9812736, 00:09:20.069 "min_read_latency_ticks": 175524, 00:09:20.069 "write_latency_ticks": 0, 00:09:20.069 "max_write_latency_ticks": 0, 00:09:20.069 "min_write_latency_ticks": 0, 00:09:20.069 "unmap_latency_ticks": 0, 00:09:20.069 "max_unmap_latency_ticks": 0, 00:09:20.069 "min_unmap_latency_ticks": 0, 00:09:20.069 "copy_latency_ticks": 0, 00:09:20.069 "max_copy_latency_ticks": 0, 00:09:20.069 "min_copy_latency_ticks": 0, 00:09:20.069 "io_error": {} 00:09:20.069 } 00:09:20.069 ] 00:09:20.069 }' 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=249860 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:09:20.069 "tick_rate": 2100000000, 00:09:20.069 "ticks": 9618024003346564, 00:09:20.069 
"name": "Malloc_STAT", 00:09:20.069 "channels": [ 00:09:20.069 { 00:09:20.069 "thread_id": 2, 00:09:20.069 "bytes_read": 522190848, 00:09:20.069 "num_read_ops": 127488, 00:09:20.069 "bytes_written": 0, 00:09:20.069 "num_write_ops": 0, 00:09:20.069 "bytes_unmapped": 0, 00:09:20.069 "num_unmap_ops": 0, 00:09:20.069 "bytes_copied": 0, 00:09:20.069 "num_copy_ops": 0, 00:09:20.069 "read_latency_ticks": 1061244832246, 00:09:20.069 "max_read_latency_ticks": 9021560, 00:09:20.069 "min_read_latency_ticks": 5402444, 00:09:20.069 "write_latency_ticks": 0, 00:09:20.069 "max_write_latency_ticks": 0, 00:09:20.069 "min_write_latency_ticks": 0, 00:09:20.069 "unmap_latency_ticks": 0, 00:09:20.069 "max_unmap_latency_ticks": 0, 00:09:20.069 "min_unmap_latency_ticks": 0, 00:09:20.069 "copy_latency_ticks": 0, 00:09:20.069 "max_copy_latency_ticks": 0, 00:09:20.069 "min_copy_latency_ticks": 0 00:09:20.069 }, 00:09:20.069 { 00:09:20.069 "thread_id": 3, 00:09:20.069 "bytes_read": 526385152, 00:09:20.069 "num_read_ops": 128512, 00:09:20.069 "bytes_written": 0, 00:09:20.069 "num_write_ops": 0, 00:09:20.069 "bytes_unmapped": 0, 00:09:20.069 "num_unmap_ops": 0, 00:09:20.069 "bytes_copied": 0, 00:09:20.069 "num_copy_ops": 0, 00:09:20.069 "read_latency_ticks": 1061584685584, 00:09:20.069 "max_read_latency_ticks": 9812736, 00:09:20.069 "min_read_latency_ticks": 5422142, 00:09:20.069 "write_latency_ticks": 0, 00:09:20.069 "max_write_latency_ticks": 0, 00:09:20.069 "min_write_latency_ticks": 0, 00:09:20.069 "unmap_latency_ticks": 0, 00:09:20.069 "max_unmap_latency_ticks": 0, 00:09:20.069 "min_unmap_latency_ticks": 0, 00:09:20.069 "copy_latency_ticks": 0, 00:09:20.069 "max_copy_latency_ticks": 0, 00:09:20.069 "min_copy_latency_ticks": 0 00:09:20.069 } 00:09:20.069 ] 00:09:20.069 }' 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=127488 
00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=127488 00:09:20.069 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:09:20.328 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=128512 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=256000 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:09:20.329 "tick_rate": 2100000000, 00:09:20.329 "ticks": 9618024230050166, 00:09:20.329 "bdevs": [ 00:09:20.329 { 00:09:20.329 "name": "Malloc_STAT", 00:09:20.329 "bytes_read": 1106293248, 00:09:20.329 "num_read_ops": 270084, 00:09:20.329 "bytes_written": 0, 00:09:20.329 "num_write_ops": 0, 00:09:20.329 "bytes_unmapped": 0, 00:09:20.329 "num_unmap_ops": 0, 00:09:20.329 "bytes_copied": 0, 00:09:20.329 "num_copy_ops": 0, 00:09:20.329 "read_latency_ticks": 2240474986868, 00:09:20.329 "max_read_latency_ticks": 9812736, 00:09:20.329 "min_read_latency_ticks": 175524, 00:09:20.329 "write_latency_ticks": 0, 00:09:20.329 "max_write_latency_ticks": 0, 00:09:20.329 "min_write_latency_ticks": 0, 00:09:20.329 "unmap_latency_ticks": 0, 00:09:20.329 "max_unmap_latency_ticks": 0, 00:09:20.329 "min_unmap_latency_ticks": 0, 00:09:20.329 "copy_latency_ticks": 0, 00:09:20.329 "max_copy_latency_ticks": 0, 00:09:20.329 "min_copy_latency_ticks": 0, 00:09:20.329 "io_error": {} 00:09:20.329 } 00:09:20.329 ] 00:09:20.329 }' 00:09:20.329 11:50:10 
blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=270084 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 256000 -lt 249860 ']' 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 256000 -gt 270084 ']' 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:20.329 00:09:20.329 Latency(us) 00:09:20.329 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:20.329 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:20.329 Malloc_STAT : 2.15 64445.45 251.74 0.00 0.00 3964.34 975.24 4306.65 00:09:20.329 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:20.329 Malloc_STAT : 2.15 64996.32 253.89 0.00 0.00 3930.93 655.36 4681.14 00:09:20.329 =================================================================================================================== 00:09:20.329 Total : 129441.77 505.63 0.00 0.00 3947.56 655.36 4681.14 00:09:20.329 0 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 587589 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 587589 ']' 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 587589 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 
00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 587589 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 587589' 00:09:20.329 killing process with pid 587589 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 587589 00:09:20.329 Received shutdown signal, test time was about 2.216013 seconds 00:09:20.329 00:09:20.329 Latency(us) 00:09:20.329 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:20.329 =================================================================================================================== 00:09:20.329 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:20.329 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 587589 00:09:20.588 11:50:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:09:20.588 00:09:20.588 real 0m3.367s 00:09:20.588 user 0m6.802s 00:09:20.588 sys 0m0.301s 00:09:20.588 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:20.588 11:50:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:20.588 ************************************ 00:09:20.588 END TEST bdev_stat 00:09:20.588 ************************************ 00:09:20.588 11:50:10 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:20.588 11:50:10 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:09:20.588 11:50:10 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:09:20.588 11:50:10 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:09:20.588 
11:50:10 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:09:20.588 11:50:10 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:09:20.588 11:50:10 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:20.589 11:50:10 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:09:20.589 11:50:10 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:09:20.589 11:50:10 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:09:20.589 11:50:10 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:09:20.589 00:09:20.589 real 1m43.018s 00:09:20.589 user 7m2.495s 00:09:20.589 sys 0m14.355s 00:09:20.589 11:50:10 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:20.589 11:50:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:20.589 ************************************ 00:09:20.589 END TEST blockdev_general 00:09:20.589 ************************************ 00:09:20.589 11:50:10 -- common/autotest_common.sh@1142 -- # return 0 00:09:20.589 11:50:10 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:20.589 11:50:10 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:20.589 11:50:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:20.589 11:50:10 -- common/autotest_common.sh@10 -- # set +x 00:09:20.589 ************************************ 00:09:20.589 START TEST bdev_raid 00:09:20.589 ************************************ 00:09:20.589 11:50:10 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:20.589 * Looking for test storage... 
00:09:20.589 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:20.589 11:50:10 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:20.589 11:50:10 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:09:20.589 11:50:10 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:09:20.589 11:50:10 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:09:20.849 11:50:10 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:09:20.849 11:50:10 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:09:20.849 11:50:10 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:09:20.849 11:50:10 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:09:20.849 11:50:10 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:09:20.849 11:50:10 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:09:20.849 11:50:10 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:09:20.849 11:50:10 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:09:20.849 11:50:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:20.849 11:50:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:20.849 11:50:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:20.849 ************************************ 00:09:20.849 START TEST raid_function_test_raid0 00:09:20.849 ************************************ 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:20.849 11:50:10 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=588266 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 588266' 00:09:20.849 Process raid pid: 588266 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 588266 /var/tmp/spdk-raid.sock 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 588266 ']' 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:20.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:20.849 11:50:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:20.849 [2024-07-12 11:50:10.936856] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:09:20.849 [2024-07-12 11:50:10.936895] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:20.849 [2024-07-12 11:50:11.003781] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.849 [2024-07-12 11:50:11.074621] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:21.108 [2024-07-12 11:50:11.124082] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:21.108 [2024-07-12 11:50:11.124105] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:21.676 11:50:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:21.676 11:50:11 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:09:21.676 11:50:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:09:21.676 11:50:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:09:21.677 11:50:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:21.677 11:50:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:09:21.677 11:50:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:21.677 [2024-07-12 11:50:11.902581] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:21.677 [2024-07-12 11:50:11.903349] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:21.677 [2024-07-12 11:50:11.903391] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e743f0 00:09:21.677 [2024-07-12 11:50:11.903396] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:21.677 [2024-07-12 11:50:11.903560] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e77e30 00:09:21.677 [2024-07-12 11:50:11.903636] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e743f0 00:09:21.677 [2024-07-12 11:50:11.903641] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1e743f0 00:09:21.677 [2024-07-12 11:50:11.903707] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:21.677 Base_1 00:09:21.677 Base_2 00:09:21.936 11:50:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:21.936 11:50:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:21.936 11:50:11 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:21.936 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:22.196 [2024-07-12 11:50:12.247465] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e782c0 00:09:22.196 /dev/nbd0 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:22.196 1+0 records in 00:09:22.196 1+0 
records out 00:09:22.196 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217514 s, 18.8 MB/s 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:22.196 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:22.455 { 00:09:22.455 "nbd_device": "/dev/nbd0", 00:09:22.455 "bdev_name": "raid" 00:09:22.455 } 00:09:22.455 ]' 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:22.455 { 00:09:22.455 "nbd_device": "/dev/nbd0", 00:09:22.455 "bdev_name": "raid" 00:09:22.455 } 00:09:22.455 ]' 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:09:22.455 11:50:12 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:22.455 4096+0 records in 00:09:22.455 4096+0 records out 00:09:22.455 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0255111 s, 82.2 MB/s 00:09:22.455 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:22.715 4096+0 records in 00:09:22.715 4096+0 records out 00:09:22.715 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.14495 s, 14.5 MB/s 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:22.715 128+0 records in 00:09:22.715 128+0 records out 00:09:22.715 65536 
bytes (66 kB, 64 KiB) copied, 0.000178023 s, 368 MB/s 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:22.715 2035+0 records in 00:09:22.715 2035+0 records out 00:09:22.715 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00493597 s, 211 MB/s 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:22.715 456+0 records in 00:09:22.715 456+0 records out 00:09:22.715 233472 bytes (233 kB, 228 KiB) copied, 0.0011414 s, 205 MB/s 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:22.715 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:22.973 [2024-07-12 11:50:12.967101] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:22.973 11:50:12 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:22.973 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:22.973 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 588266 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 588266 ']' 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 588266 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:22.974 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 588266 00:09:23.233 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:23.233 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:23.233 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 588266' 00:09:23.233 killing process with pid 588266 00:09:23.233 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 588266 00:09:23.233 [2024-07-12 11:50:13.235114] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:23.233 [2024-07-12 11:50:13.235162] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:23.233 [2024-07-12 11:50:13.235187] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:23.233 [2024-07-12 11:50:13.235194] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e743f0 name raid, 
state offline 00:09:23.233 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 588266 00:09:23.233 [2024-07-12 11:50:13.250283] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:23.233 11:50:13 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:09:23.233 00:09:23.233 real 0m2.536s 00:09:23.233 user 0m3.410s 00:09:23.233 sys 0m0.766s 00:09:23.233 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:23.233 11:50:13 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:23.233 ************************************ 00:09:23.233 END TEST raid_function_test_raid0 00:09:23.233 ************************************ 00:09:23.233 11:50:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:23.233 11:50:13 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:09:23.233 11:50:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:23.233 11:50:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:23.233 11:50:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:23.492 ************************************ 00:09:23.492 START TEST raid_function_test_concat 00:09:23.492 ************************************ 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=588730 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 588730' 00:09:23.492 Process raid pid: 588730 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 588730 /var/tmp/spdk-raid.sock 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 588730 ']' 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:23.492 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:23.492 11:50:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:23.492 [2024-07-12 11:50:13.518245] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:09:23.492 [2024-07-12 11:50:13.518278] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:23.492 [2024-07-12 11:50:13.581965] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:23.492 [2024-07-12 11:50:13.660654] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.492 [2024-07-12 11:50:13.711088] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:23.492 [2024-07-12 11:50:13.711111] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:24.431 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:24.431 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:09:24.431 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:09:24.432 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:09:24.432 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:24.432 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:09:24.432 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:24.432 [2024-07-12 11:50:14.501809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:24.432 [2024-07-12 11:50:14.502557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:24.432 [2024-07-12 11:50:14.502598] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11293f0 00:09:24.432 [2024-07-12 11:50:14.502603] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:24.432 [2024-07-12 11:50:14.502756] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x112ce60 00:09:24.432 [2024-07-12 11:50:14.502830] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11293f0 00:09:24.432 [2024-07-12 11:50:14.502835] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x11293f0 00:09:24.432 [2024-07-12 11:50:14.502900] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:24.432 Base_1 00:09:24.432 Base_2 00:09:24.432 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:24.432 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:24.432 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:24.691 [2024-07-12 11:50:14.846711] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x112b220 00:09:24.691 /dev/nbd0 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:24.691 1+0 records in 
00:09:24.691 1+0 records out 00:09:24.691 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00018008 s, 22.7 MB/s 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:24.691 11:50:14 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:24.950 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:24.950 { 00:09:24.950 "nbd_device": "/dev/nbd0", 00:09:24.950 "bdev_name": "raid" 00:09:24.950 } 00:09:24.950 ]' 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:24.951 { 00:09:24.951 "nbd_device": "/dev/nbd0", 00:09:24.951 "bdev_name": "raid" 00:09:24.951 } 00:09:24.951 ]' 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:24.951 11:50:15 
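The `waitfornbd` helper traced above (from `common/autotest_common.sh`) polls `/proc/partitions` until the nbd device name shows up, giving the kernel up to 20 attempts to publish it before the dd read test runs. A hedged sketch of that polling idiom — the body is reconstructed from the xtrace, not copied from the script:

```shell
# Reconstructed from the trace: poll /proc/partitions for the device
# name (e.g. "nbd0"), up to 20 attempts, as waitfornbd does.
waitfornbd() {
    local nbd_name=$1 i
    for (( i = 1; i <= 20; i++ )); do
        # -w matches the whole word, so "nbd0" does not match "nbd01"
        grep -q -w "$nbd_name" /proc/partitions && return 0
        sleep 0.1
    done
    return 1
}

# Example: a name that never appears makes the helper time out.
waitfornbd surely_missing_device || echo "timed out as expected"
```

The word-boundary `grep -w` matters here: without it, waiting for `nbd0` would return prematurely if another `nbd0N` device were already listed.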
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:24.951 4096+0 records in 00:09:24.951 4096+0 records out 00:09:24.951 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.022872 s, 91.7 MB/s 00:09:24.951 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:25.210 4096+0 records in 00:09:25.210 4096+0 records out 00:09:25.210 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.14591 s, 14.4 MB/s 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:25.210 
128+0 records in 00:09:25.210 128+0 records out 00:09:25.210 65536 bytes (66 kB, 64 KiB) copied, 0.000371721 s, 176 MB/s 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:25.210 2035+0 records in 00:09:25.210 2035+0 records out 00:09:25.210 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0050514 s, 206 MB/s 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:25.210 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:25.211 456+0 records in 00:09:25.211 456+0 records out 00:09:25.211 233472 bytes (233 kB, 228 KiB) copied, 0.00118344 s, 197 MB/s 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.211 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 
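The loop traced above is `bdev_raid.sh`'s unmap/verify pattern: fill the raid bdev through `/dev/nbd0` with random data, discard a block range with `blkdiscard`, flush, then `cmp` against a reference file whose same range was zeroed. A minimal stand-alone sketch of that pattern, using plain files in a temp directory in place of the real nbd device (the `fake_nbd` stand-in and temp paths are illustrative, not from the test):

```shell
#!/usr/bin/env bash
# Sketch of the write/discard/compare loop traced above. A plain file
# stands in for /dev/nbd0; on the real device the zeroing step is
# "blkdiscard -o $unmap_off -l $unmap_len /dev/nbd0" followed by
# "blockdev --flushbufs /dev/nbd0".
set -e
blksize=512
rw_blk_num=4096                      # 4096 * 512 = 2097152 bytes under test
unmap_blk_offs=(0 1028 321)          # same offsets as bdev_raid.sh@25
unmap_blk_nums=(128 2035 456)        # same lengths  as bdev_raid.sh@26

workdir=$(mktemp -d)
ref=$workdir/raidrandtest            # reference copy of the data
dev=$workdir/fake_nbd                # stand-in for the nbd-exported raid bdev

dd if=/dev/urandom of="$ref" bs=$blksize count=$rw_blk_num status=none
cp "$ref" "$dev"

for i in 0 1 2; do
    unmap_off=$(( unmap_blk_offs[i] * blksize ))
    unmap_len=$(( unmap_blk_nums[i] * blksize ))
    # Zero the range in both files: discarded blocks read back as zeroes.
    for f in "$ref" "$dev"; do
        dd if=/dev/zero of="$f" bs=$blksize seek=${unmap_blk_offs[i]} \
           count=${unmap_blk_nums[i]} conv=notrunc status=none
    done
    # Byte-for-byte verification, exactly as bdev_raid.sh@50 does.
    cmp -b -n $(( rw_blk_num * blksize )) "$ref" "$dev"
done
echo "unmap/verify loop passed for offsets ${unmap_blk_offs[*]}"
rm -rf "$workdir"
```

The byte values in the trace follow directly from this arithmetic: offset 1028 becomes 1028 × 512 = 526336 and length 2035 becomes 2035 × 512 = 1041920, matching the `blkdiscard -o 526336 -l 1041920` call logged above.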
00:09:25.470 [2024-07-12 11:50:15.557998] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:25.470 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- 
# true 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 588730 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 588730 ']' 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 588730 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 588730 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 588730' 00:09:25.730 killing process with pid 588730 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 588730 00:09:25.730 [2024-07-12 11:50:15.824209] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:25.730 [2024-07-12 11:50:15.824256] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:25.730 [2024-07-12 11:50:15.824283] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:25.730 
[2024-07-12 11:50:15.824288] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11293f0 name raid, state offline 00:09:25.730 11:50:15 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 588730 00:09:25.730 [2024-07-12 11:50:15.839493] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:25.988 11:50:16 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:09:25.988 00:09:25.988 real 0m2.524s 00:09:25.988 user 0m3.425s 00:09:25.988 sys 0m0.739s 00:09:25.988 11:50:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:25.988 11:50:16 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:25.988 ************************************ 00:09:25.988 END TEST raid_function_test_concat 00:09:25.988 ************************************ 00:09:25.988 11:50:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:25.988 11:50:16 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:09:25.988 11:50:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:25.988 11:50:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:25.989 11:50:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:25.989 ************************************ 00:09:25.989 START TEST raid0_resize_test 00:09:25.989 ************************************ 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:09:25.989 11:50:16 
bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=589220 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 589220' 00:09:25.989 Process raid pid: 589220 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 589220 /var/tmp/spdk-raid.sock 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 589220 ']' 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:25.989 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:25.989 11:50:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:25.989 [2024-07-12 11:50:16.129934] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:09:25.989 [2024-07-12 11:50:16.129979] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:25.989 [2024-07-12 11:50:16.199077] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:26.247 [2024-07-12 11:50:16.274386] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.247 [2024-07-12 11:50:16.328101] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:26.247 [2024-07-12 11:50:16.328125] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:26.817 11:50:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:26.817 11:50:16 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:09:26.817 11:50:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:09:27.077 Base_1 00:09:27.077 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:09:27.077 Base_2 00:09:27.077 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:09:27.336 [2024-07-12 11:50:17.396067] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:27.336 [2024-07-12 11:50:17.396977] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:27.336 [2024-07-12 11:50:17.397010] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a9a390 00:09:27.336 [2024-07-12 11:50:17.397015] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:27.336 [2024-07-12 11:50:17.397131] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c4c970 00:09:27.336 [2024-07-12 11:50:17.397187] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a9a390 00:09:27.336 [2024-07-12 11:50:17.397192] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x1a9a390 00:09:27.336 [2024-07-12 11:50:17.397253] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:27.336 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:09:27.336 [2024-07-12 11:50:17.568494] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:27.336 [2024-07-12 11:50:17.568505] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:09:27.336 true 00:09:27.595 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:27.595 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:09:27.595 [2024-07-12 11:50:17.745048] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:27.595 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:09:27.595 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:09:27.595 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:09:27.595 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:09:27.854 
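`raid0_resize_test` checks the raid bdev's size by pulling `num_blocks` out of `bdev_get_bdevs` with `jq` and converting it back to MiB. That size check can be sketched as follows, with the block count taken from the trace (the RPC pipeline is shown only as a comment because it needs the live `/var/tmp/spdk-raid.sock` target):

```shell
# Size check done by raid0_resize_test. On the live target the block
# count comes from:
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid \
#       | jq '.[].num_blocks'
blksize=512
blkcnt=131072                              # two 32 MiB base bdevs in raid0
raid_size_mb=$(( blkcnt * blksize / 1048576 ))
echo "raid size: ${raid_size_mb} MiB"      # 131072 * 512 / 1048576 = 64
test "$raid_size_mb" -eq 64                # matches bdev_raid.sh@370 check
```

After both base bdevs are resized to 64 MiB, the same arithmetic yields 262144 × 512 / 1048576 = 128 MiB, which is exactly the `'[' 128 '!=' 128 ']'` comparison logged further on.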
[2024-07-12 11:50:17.921401] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:27.854 [2024-07-12 11:50:17.921414] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:09:27.854 [2024-07-12 11:50:17.921430] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:09:27.854 true 00:09:27.854 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:27.854 11:50:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:09:27.854 [2024-07-12 11:50:18.089941] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 589220 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 589220 ']' 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 589220 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 589220 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 589220' 00:09:28.114 killing process with pid 589220 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 589220 00:09:28.114 [2024-07-12 11:50:18.131281] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:28.114 [2024-07-12 11:50:18.131325] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:28.114 [2024-07-12 11:50:18.131353] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:28.114 [2024-07-12 11:50:18.131360] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a9a390 name Raid, state offline 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 589220 00:09:28.114 [2024-07-12 11:50:18.132408] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:09:28.114 00:09:28.114 real 0m2.214s 00:09:28.114 user 0m3.317s 00:09:28.114 sys 0m0.427s 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:28.114 11:50:18 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:28.114 ************************************ 00:09:28.114 END TEST raid0_resize_test 00:09:28.114 ************************************ 00:09:28.114 11:50:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:28.114 11:50:18 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:09:28.114 11:50:18 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:09:28.114 11:50:18 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:09:28.114 11:50:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 
1 ']' 00:09:28.114 11:50:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:28.114 11:50:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:28.114 ************************************ 00:09:28.114 START TEST raid_state_function_test 00:09:28.114 ************************************ 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:28.114 11:50:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=589708 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 589708' 00:09:28.114 Process raid pid: 589708 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 589708 /var/tmp/spdk-raid.sock 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 589708 ']' 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:28.114 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:28.373 11:50:18 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:28.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:28.373 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:28.374 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:28.374 [2024-07-12 11:50:18.388311] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:09:28.374 [2024-07-12 11:50:18.388345] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:28.374 [2024-07-12 11:50:18.453839] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:28.374 [2024-07-12 11:50:18.531849] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:28.374 [2024-07-12 11:50:18.583668] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:28.374 [2024-07-12 11:50:18.583690] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:29.311 [2024-07-12 11:50:19.334690] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:29.311 [2024-07-12 11:50:19.334718] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev1 doesn't exist now 00:09:29.311 [2024-07-12 11:50:19.334723] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:29.311 [2024-07-12 11:50:19.334729] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:29.311 "name": "Existed_Raid", 00:09:29.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:29.311 
"strip_size_kb": 64, 00:09:29.311 "state": "configuring", 00:09:29.311 "raid_level": "raid0", 00:09:29.311 "superblock": false, 00:09:29.311 "num_base_bdevs": 2, 00:09:29.311 "num_base_bdevs_discovered": 0, 00:09:29.311 "num_base_bdevs_operational": 2, 00:09:29.311 "base_bdevs_list": [ 00:09:29.311 { 00:09:29.311 "name": "BaseBdev1", 00:09:29.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:29.311 "is_configured": false, 00:09:29.311 "data_offset": 0, 00:09:29.311 "data_size": 0 00:09:29.311 }, 00:09:29.311 { 00:09:29.311 "name": "BaseBdev2", 00:09:29.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:29.311 "is_configured": false, 00:09:29.311 "data_offset": 0, 00:09:29.311 "data_size": 0 00:09:29.311 } 00:09:29.311 ] 00:09:29.311 }' 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:29.311 11:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:29.879 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:29.879 [2024-07-12 11:50:20.100591] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:29.879 [2024-07-12 11:50:20.100616] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a141b0 name Existed_Raid, state configuring 00:09:29.879 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:30.138 [2024-07-12 11:50:20.281064] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:30.138 [2024-07-12 11:50:20.281082] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:30.138 [2024-07-12 11:50:20.281087] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:30.138 [2024-07-12 11:50:20.281092] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:30.138 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:30.397 [2024-07-12 11:50:20.466100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:30.397 BaseBdev1 00:09:30.397 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:30.397 11:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:09:30.397 11:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:30.397 11:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:09:30.397 11:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:30.397 11:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:30.397 11:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:30.656 [ 00:09:30.656 { 00:09:30.656 "name": "BaseBdev1", 00:09:30.656 "aliases": [ 00:09:30.656 "7689be0b-493a-4e07-bded-eaf1faf6293e" 00:09:30.656 ], 00:09:30.656 "product_name": "Malloc disk", 00:09:30.656 "block_size": 512, 00:09:30.656 "num_blocks": 65536, 00:09:30.656 "uuid": 
"7689be0b-493a-4e07-bded-eaf1faf6293e", 00:09:30.656 "assigned_rate_limits": { 00:09:30.656 "rw_ios_per_sec": 0, 00:09:30.656 "rw_mbytes_per_sec": 0, 00:09:30.656 "r_mbytes_per_sec": 0, 00:09:30.656 "w_mbytes_per_sec": 0 00:09:30.656 }, 00:09:30.656 "claimed": true, 00:09:30.656 "claim_type": "exclusive_write", 00:09:30.656 "zoned": false, 00:09:30.656 "supported_io_types": { 00:09:30.656 "read": true, 00:09:30.656 "write": true, 00:09:30.656 "unmap": true, 00:09:30.656 "flush": true, 00:09:30.656 "reset": true, 00:09:30.656 "nvme_admin": false, 00:09:30.656 "nvme_io": false, 00:09:30.656 "nvme_io_md": false, 00:09:30.656 "write_zeroes": true, 00:09:30.656 "zcopy": true, 00:09:30.656 "get_zone_info": false, 00:09:30.656 "zone_management": false, 00:09:30.656 "zone_append": false, 00:09:30.656 "compare": false, 00:09:30.656 "compare_and_write": false, 00:09:30.656 "abort": true, 00:09:30.656 "seek_hole": false, 00:09:30.656 "seek_data": false, 00:09:30.656 "copy": true, 00:09:30.656 "nvme_iov_md": false 00:09:30.656 }, 00:09:30.656 "memory_domains": [ 00:09:30.656 { 00:09:30.656 "dma_device_id": "system", 00:09:30.656 "dma_device_type": 1 00:09:30.656 }, 00:09:30.656 { 00:09:30.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:30.656 "dma_device_type": 2 00:09:30.656 } 00:09:30.656 ], 00:09:30.656 "driver_specific": {} 00:09:30.656 } 00:09:30.656 ] 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:30.656 11:50:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:30.656 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:30.914 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:30.914 "name": "Existed_Raid", 00:09:30.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:30.914 "strip_size_kb": 64, 00:09:30.914 "state": "configuring", 00:09:30.914 "raid_level": "raid0", 00:09:30.914 "superblock": false, 00:09:30.914 "num_base_bdevs": 2, 00:09:30.914 "num_base_bdevs_discovered": 1, 00:09:30.914 "num_base_bdevs_operational": 2, 00:09:30.914 "base_bdevs_list": [ 00:09:30.914 { 00:09:30.914 "name": "BaseBdev1", 00:09:30.914 "uuid": "7689be0b-493a-4e07-bded-eaf1faf6293e", 00:09:30.914 "is_configured": true, 00:09:30.914 "data_offset": 0, 00:09:30.914 "data_size": 65536 00:09:30.914 }, 00:09:30.914 { 00:09:30.914 "name": "BaseBdev2", 00:09:30.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:30.914 "is_configured": false, 00:09:30.914 "data_offset": 0, 00:09:30.914 "data_size": 0 00:09:30.914 } 00:09:30.914 ] 00:09:30.914 }' 00:09:30.914 11:50:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:30.914 11:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:31.482 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:31.482 [2024-07-12 11:50:21.597037] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:31.482 [2024-07-12 11:50:21.597064] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a13aa0 name Existed_Raid, state configuring 00:09:31.482 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:31.740 [2024-07-12 11:50:21.765494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:31.740 [2024-07-12 11:50:21.766574] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:31.740 [2024-07-12 11:50:21.766597] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:31.740 "name": "Existed_Raid", 00:09:31.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:31.740 "strip_size_kb": 64, 00:09:31.740 "state": "configuring", 00:09:31.740 "raid_level": "raid0", 00:09:31.740 "superblock": false, 00:09:31.740 "num_base_bdevs": 2, 00:09:31.740 "num_base_bdevs_discovered": 1, 00:09:31.740 "num_base_bdevs_operational": 2, 00:09:31.740 "base_bdevs_list": [ 00:09:31.740 { 00:09:31.740 "name": "BaseBdev1", 00:09:31.740 "uuid": "7689be0b-493a-4e07-bded-eaf1faf6293e", 00:09:31.740 "is_configured": true, 00:09:31.740 "data_offset": 0, 00:09:31.740 "data_size": 65536 00:09:31.740 }, 00:09:31.740 { 00:09:31.740 "name": "BaseBdev2", 00:09:31.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:31.740 "is_configured": false, 00:09:31.740 "data_offset": 0, 00:09:31.740 "data_size": 0 00:09:31.740 } 00:09:31.740 ] 00:09:31.740 }' 
00:09:31.740 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:31.741 11:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:32.307 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:32.567 [2024-07-12 11:50:22.598325] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:32.567 [2024-07-12 11:50:22.598353] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a14890 00:09:32.567 [2024-07-12 11:50:22.598357] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:32.567 [2024-07-12 11:50:22.598483] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a12c20 00:09:32.567 [2024-07-12 11:50:22.598571] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a14890 00:09:32.567 [2024-07-12 11:50:22.598576] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a14890 00:09:32.567 [2024-07-12 11:50:22.598692] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:32.567 BaseBdev2 00:09:32.567 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:32.567 11:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:09:32.567 11:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:32.567 11:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:09:32.567 11:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:32.567 11:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:09:32.567 11:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:32.567 11:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:32.826 [ 00:09:32.826 { 00:09:32.826 "name": "BaseBdev2", 00:09:32.826 "aliases": [ 00:09:32.826 "0bdfab46-1799-44e3-8112-711c6d59547b" 00:09:32.826 ], 00:09:32.826 "product_name": "Malloc disk", 00:09:32.826 "block_size": 512, 00:09:32.826 "num_blocks": 65536, 00:09:32.826 "uuid": "0bdfab46-1799-44e3-8112-711c6d59547b", 00:09:32.826 "assigned_rate_limits": { 00:09:32.826 "rw_ios_per_sec": 0, 00:09:32.826 "rw_mbytes_per_sec": 0, 00:09:32.826 "r_mbytes_per_sec": 0, 00:09:32.826 "w_mbytes_per_sec": 0 00:09:32.826 }, 00:09:32.826 "claimed": true, 00:09:32.826 "claim_type": "exclusive_write", 00:09:32.826 "zoned": false, 00:09:32.826 "supported_io_types": { 00:09:32.826 "read": true, 00:09:32.826 "write": true, 00:09:32.826 "unmap": true, 00:09:32.826 "flush": true, 00:09:32.826 "reset": true, 00:09:32.826 "nvme_admin": false, 00:09:32.826 "nvme_io": false, 00:09:32.826 "nvme_io_md": false, 00:09:32.826 "write_zeroes": true, 00:09:32.826 "zcopy": true, 00:09:32.826 "get_zone_info": false, 00:09:32.826 "zone_management": false, 00:09:32.826 "zone_append": false, 00:09:32.826 "compare": false, 00:09:32.826 "compare_and_write": false, 00:09:32.826 "abort": true, 00:09:32.826 "seek_hole": false, 00:09:32.826 "seek_data": false, 00:09:32.826 "copy": true, 00:09:32.826 "nvme_iov_md": false 00:09:32.826 }, 00:09:32.826 "memory_domains": [ 00:09:32.826 { 00:09:32.826 "dma_device_id": "system", 00:09:32.826 "dma_device_type": 1 00:09:32.826 }, 00:09:32.826 { 00:09:32.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:32.826 "dma_device_type": 2 
00:09:32.826 } 00:09:32.826 ], 00:09:32.826 "driver_specific": {} 00:09:32.826 } 00:09:32.826 ] 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:32.826 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:33.085 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:09:33.085 "name": "Existed_Raid", 00:09:33.085 "uuid": "931ec18d-ab22-4a3c-9399-35e6603ce3f6", 00:09:33.085 "strip_size_kb": 64, 00:09:33.085 "state": "online", 00:09:33.085 "raid_level": "raid0", 00:09:33.085 "superblock": false, 00:09:33.085 "num_base_bdevs": 2, 00:09:33.085 "num_base_bdevs_discovered": 2, 00:09:33.085 "num_base_bdevs_operational": 2, 00:09:33.085 "base_bdevs_list": [ 00:09:33.085 { 00:09:33.085 "name": "BaseBdev1", 00:09:33.085 "uuid": "7689be0b-493a-4e07-bded-eaf1faf6293e", 00:09:33.085 "is_configured": true, 00:09:33.085 "data_offset": 0, 00:09:33.085 "data_size": 65536 00:09:33.085 }, 00:09:33.085 { 00:09:33.085 "name": "BaseBdev2", 00:09:33.085 "uuid": "0bdfab46-1799-44e3-8112-711c6d59547b", 00:09:33.085 "is_configured": true, 00:09:33.085 "data_offset": 0, 00:09:33.085 "data_size": 65536 00:09:33.085 } 00:09:33.085 ] 00:09:33.085 }' 00:09:33.085 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:33.085 11:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:33.653 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:33.653 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:33.653 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:33.653 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:33.653 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:33.653 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:33.653 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:33.653 11:50:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:33.653 [2024-07-12 11:50:23.769523] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:33.653 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:33.653 "name": "Existed_Raid", 00:09:33.653 "aliases": [ 00:09:33.653 "931ec18d-ab22-4a3c-9399-35e6603ce3f6" 00:09:33.653 ], 00:09:33.653 "product_name": "Raid Volume", 00:09:33.653 "block_size": 512, 00:09:33.653 "num_blocks": 131072, 00:09:33.653 "uuid": "931ec18d-ab22-4a3c-9399-35e6603ce3f6", 00:09:33.653 "assigned_rate_limits": { 00:09:33.653 "rw_ios_per_sec": 0, 00:09:33.653 "rw_mbytes_per_sec": 0, 00:09:33.653 "r_mbytes_per_sec": 0, 00:09:33.653 "w_mbytes_per_sec": 0 00:09:33.653 }, 00:09:33.653 "claimed": false, 00:09:33.653 "zoned": false, 00:09:33.653 "supported_io_types": { 00:09:33.653 "read": true, 00:09:33.653 "write": true, 00:09:33.653 "unmap": true, 00:09:33.653 "flush": true, 00:09:33.654 "reset": true, 00:09:33.654 "nvme_admin": false, 00:09:33.654 "nvme_io": false, 00:09:33.654 "nvme_io_md": false, 00:09:33.654 "write_zeroes": true, 00:09:33.654 "zcopy": false, 00:09:33.654 "get_zone_info": false, 00:09:33.654 "zone_management": false, 00:09:33.654 "zone_append": false, 00:09:33.654 "compare": false, 00:09:33.654 "compare_and_write": false, 00:09:33.654 "abort": false, 00:09:33.654 "seek_hole": false, 00:09:33.654 "seek_data": false, 00:09:33.654 "copy": false, 00:09:33.654 "nvme_iov_md": false 00:09:33.654 }, 00:09:33.654 "memory_domains": [ 00:09:33.654 { 00:09:33.654 "dma_device_id": "system", 00:09:33.654 "dma_device_type": 1 00:09:33.654 }, 00:09:33.654 { 00:09:33.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:33.654 "dma_device_type": 2 00:09:33.654 }, 00:09:33.654 { 00:09:33.654 "dma_device_id": "system", 00:09:33.654 "dma_device_type": 1 00:09:33.654 }, 00:09:33.654 { 00:09:33.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:09:33.654 "dma_device_type": 2 00:09:33.654 } 00:09:33.654 ], 00:09:33.654 "driver_specific": { 00:09:33.654 "raid": { 00:09:33.654 "uuid": "931ec18d-ab22-4a3c-9399-35e6603ce3f6", 00:09:33.654 "strip_size_kb": 64, 00:09:33.654 "state": "online", 00:09:33.654 "raid_level": "raid0", 00:09:33.654 "superblock": false, 00:09:33.654 "num_base_bdevs": 2, 00:09:33.654 "num_base_bdevs_discovered": 2, 00:09:33.654 "num_base_bdevs_operational": 2, 00:09:33.654 "base_bdevs_list": [ 00:09:33.654 { 00:09:33.654 "name": "BaseBdev1", 00:09:33.654 "uuid": "7689be0b-493a-4e07-bded-eaf1faf6293e", 00:09:33.654 "is_configured": true, 00:09:33.654 "data_offset": 0, 00:09:33.654 "data_size": 65536 00:09:33.654 }, 00:09:33.654 { 00:09:33.654 "name": "BaseBdev2", 00:09:33.654 "uuid": "0bdfab46-1799-44e3-8112-711c6d59547b", 00:09:33.654 "is_configured": true, 00:09:33.654 "data_offset": 0, 00:09:33.654 "data_size": 65536 00:09:33.654 } 00:09:33.654 ] 00:09:33.654 } 00:09:33.654 } 00:09:33.654 }' 00:09:33.654 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:33.654 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:33.654 BaseBdev2' 00:09:33.654 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:33.654 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:33.654 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:33.911 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:33.911 "name": "BaseBdev1", 00:09:33.911 "aliases": [ 00:09:33.911 "7689be0b-493a-4e07-bded-eaf1faf6293e" 00:09:33.911 ], 00:09:33.911 "product_name": "Malloc disk", 
00:09:33.911 "block_size": 512, 00:09:33.911 "num_blocks": 65536, 00:09:33.911 "uuid": "7689be0b-493a-4e07-bded-eaf1faf6293e", 00:09:33.911 "assigned_rate_limits": { 00:09:33.911 "rw_ios_per_sec": 0, 00:09:33.911 "rw_mbytes_per_sec": 0, 00:09:33.911 "r_mbytes_per_sec": 0, 00:09:33.911 "w_mbytes_per_sec": 0 00:09:33.911 }, 00:09:33.911 "claimed": true, 00:09:33.911 "claim_type": "exclusive_write", 00:09:33.911 "zoned": false, 00:09:33.911 "supported_io_types": { 00:09:33.911 "read": true, 00:09:33.911 "write": true, 00:09:33.911 "unmap": true, 00:09:33.911 "flush": true, 00:09:33.911 "reset": true, 00:09:33.911 "nvme_admin": false, 00:09:33.911 "nvme_io": false, 00:09:33.911 "nvme_io_md": false, 00:09:33.911 "write_zeroes": true, 00:09:33.911 "zcopy": true, 00:09:33.911 "get_zone_info": false, 00:09:33.911 "zone_management": false, 00:09:33.911 "zone_append": false, 00:09:33.911 "compare": false, 00:09:33.911 "compare_and_write": false, 00:09:33.911 "abort": true, 00:09:33.911 "seek_hole": false, 00:09:33.911 "seek_data": false, 00:09:33.911 "copy": true, 00:09:33.911 "nvme_iov_md": false 00:09:33.911 }, 00:09:33.911 "memory_domains": [ 00:09:33.911 { 00:09:33.911 "dma_device_id": "system", 00:09:33.911 "dma_device_type": 1 00:09:33.911 }, 00:09:33.911 { 00:09:33.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:33.911 "dma_device_type": 2 00:09:33.911 } 00:09:33.911 ], 00:09:33.911 "driver_specific": {} 00:09:33.911 }' 00:09:33.911 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:33.911 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:33.911 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:33.911 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:33.911 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:33.911 11:50:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:33.911 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:34.169 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:34.169 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:34.169 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:34.169 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:34.169 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:34.169 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:34.169 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:34.169 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:34.427 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:34.427 "name": "BaseBdev2", 00:09:34.427 "aliases": [ 00:09:34.427 "0bdfab46-1799-44e3-8112-711c6d59547b" 00:09:34.427 ], 00:09:34.427 "product_name": "Malloc disk", 00:09:34.427 "block_size": 512, 00:09:34.427 "num_blocks": 65536, 00:09:34.427 "uuid": "0bdfab46-1799-44e3-8112-711c6d59547b", 00:09:34.427 "assigned_rate_limits": { 00:09:34.427 "rw_ios_per_sec": 0, 00:09:34.427 "rw_mbytes_per_sec": 0, 00:09:34.427 "r_mbytes_per_sec": 0, 00:09:34.427 "w_mbytes_per_sec": 0 00:09:34.427 }, 00:09:34.427 "claimed": true, 00:09:34.427 "claim_type": "exclusive_write", 00:09:34.427 "zoned": false, 00:09:34.427 "supported_io_types": { 00:09:34.427 "read": true, 00:09:34.427 "write": true, 00:09:34.427 "unmap": true, 00:09:34.427 "flush": true, 00:09:34.427 "reset": 
true, 00:09:34.427 "nvme_admin": false, 00:09:34.427 "nvme_io": false, 00:09:34.427 "nvme_io_md": false, 00:09:34.427 "write_zeroes": true, 00:09:34.427 "zcopy": true, 00:09:34.427 "get_zone_info": false, 00:09:34.427 "zone_management": false, 00:09:34.427 "zone_append": false, 00:09:34.427 "compare": false, 00:09:34.427 "compare_and_write": false, 00:09:34.427 "abort": true, 00:09:34.427 "seek_hole": false, 00:09:34.427 "seek_data": false, 00:09:34.427 "copy": true, 00:09:34.427 "nvme_iov_md": false 00:09:34.427 }, 00:09:34.427 "memory_domains": [ 00:09:34.427 { 00:09:34.427 "dma_device_id": "system", 00:09:34.427 "dma_device_type": 1 00:09:34.427 }, 00:09:34.427 { 00:09:34.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:34.427 "dma_device_type": 2 00:09:34.427 } 00:09:34.427 ], 00:09:34.427 "driver_specific": {} 00:09:34.427 }' 00:09:34.427 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:34.427 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:34.427 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:34.427 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:34.427 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:34.427 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:34.427 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:34.685 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:34.685 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:34.685 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:34.685 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:34.685 11:50:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:34.685 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:34.943 [2024-07-12 11:50:24.940396] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:34.943 [2024-07-12 11:50:24.940414] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:34.943 [2024-07-12 11:50:24.940445] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:34.943 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:34.943 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:34.943 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:34.943 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:34.944 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:34.944 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:34.944 "name": "Existed_Raid", 00:09:34.944 "uuid": "931ec18d-ab22-4a3c-9399-35e6603ce3f6", 00:09:34.944 "strip_size_kb": 64, 00:09:34.944 "state": "offline", 00:09:34.944 "raid_level": "raid0", 00:09:34.944 "superblock": false, 00:09:34.944 "num_base_bdevs": 2, 00:09:34.944 "num_base_bdevs_discovered": 1, 00:09:34.944 "num_base_bdevs_operational": 1, 00:09:34.944 "base_bdevs_list": [ 00:09:34.944 { 00:09:34.944 "name": null, 00:09:34.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:34.944 "is_configured": false, 00:09:34.944 "data_offset": 0, 00:09:34.944 "data_size": 65536 00:09:34.944 }, 00:09:34.944 { 00:09:34.944 "name": "BaseBdev2", 00:09:34.944 "uuid": "0bdfab46-1799-44e3-8112-711c6d59547b", 00:09:34.944 "is_configured": true, 00:09:34.944 "data_offset": 0, 00:09:34.944 "data_size": 65536 00:09:34.944 } 00:09:34.944 ] 00:09:34.944 }' 00:09:34.944 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:34.944 11:50:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:35.510 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:35.510 11:50:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:35.510 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:35.510 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:35.768 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:35.768 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:35.768 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:35.768 [2024-07-12 11:50:25.943849] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:35.768 [2024-07-12 11:50:25.943887] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a14890 name Existed_Raid, state offline 00:09:35.768 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:35.768 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:35.768 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:35.768 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 589708 
00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 589708 ']' 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 589708 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 589708 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 589708' 00:09:36.028 killing process with pid 589708 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 589708 00:09:36.028 [2024-07-12 11:50:26.174260] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:36.028 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 589708 00:09:36.028 [2024-07-12 11:50:26.175027] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:09:36.287 00:09:36.287 real 0m7.995s 00:09:36.287 user 0m14.406s 00:09:36.287 sys 0m1.246s 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:36.287 ************************************ 00:09:36.287 END TEST raid_state_function_test 00:09:36.287 ************************************ 00:09:36.287 11:50:26 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:09:36.287 11:50:26 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:09:36.287 11:50:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:36.287 11:50:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:36.287 11:50:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:36.287 ************************************ 00:09:36.287 START TEST raid_state_function_test_sb 00:09:36.287 ************************************ 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:36.287 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=591299 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 591299' 00:09:36.288 Process raid pid: 591299 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 591299 /var/tmp/spdk-raid.sock 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 591299 ']' 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:36.288 11:50:26 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:36.288 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:36.288 11:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:36.288 [2024-07-12 11:50:26.456228] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:09:36.288 [2024-07-12 11:50:26.456267] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:36.288 [2024-07-12 11:50:26.520541] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:36.546 [2024-07-12 11:50:26.597638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:36.546 [2024-07-12 11:50:26.649416] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:36.547 [2024-07-12 11:50:26.649439] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:37.114 11:50:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:37.114 11:50:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:09:37.114 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:37.373 [2024-07-12 11:50:27.372100] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:37.373 [2024-07-12 11:50:27.372129] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:37.373 [2024-07-12 11:50:27.372135] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:37.373 [2024-07-12 11:50:27.372143] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:37.373 "name": "Existed_Raid", 00:09:37.373 "uuid": "c80ac126-ca55-4b52-a97e-13465f41ea16", 00:09:37.373 "strip_size_kb": 64, 00:09:37.373 "state": "configuring", 00:09:37.373 "raid_level": "raid0", 00:09:37.373 "superblock": true, 00:09:37.373 "num_base_bdevs": 2, 00:09:37.373 "num_base_bdevs_discovered": 0, 00:09:37.373 "num_base_bdevs_operational": 2, 00:09:37.373 "base_bdevs_list": [ 00:09:37.373 { 00:09:37.373 "name": "BaseBdev1", 00:09:37.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:37.373 "is_configured": false, 00:09:37.373 "data_offset": 0, 00:09:37.373 "data_size": 0 00:09:37.373 }, 00:09:37.373 { 00:09:37.373 "name": "BaseBdev2", 00:09:37.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:37.373 "is_configured": false, 00:09:37.373 "data_offset": 0, 00:09:37.373 "data_size": 0 00:09:37.373 } 00:09:37.373 ] 00:09:37.373 }' 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:37.373 11:50:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:37.941 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:37.941 [2024-07-12 11:50:28.162052] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:37.941 [2024-07-12 11:50:28.162072] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd8d1b0 name Existed_Raid, state configuring 00:09:37.941 11:50:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:38.199 [2024-07-12 11:50:28.330507] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:38.199 [2024-07-12 11:50:28.330528] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:38.199 [2024-07-12 11:50:28.330534] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:38.199 [2024-07-12 11:50:28.330539] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:38.199 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:38.456 [2024-07-12 11:50:28.507214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:38.456 BaseBdev1 00:09:38.456 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:38.456 11:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:09:38.456 11:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:38.456 11:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:09:38.456 11:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:38.456 11:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:38.456 11:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:38.714 
11:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:38.714 [ 00:09:38.714 { 00:09:38.714 "name": "BaseBdev1", 00:09:38.714 "aliases": [ 00:09:38.714 "2fb4628e-dbfc-4f98-bcd7-0b0634d838b3" 00:09:38.714 ], 00:09:38.714 "product_name": "Malloc disk", 00:09:38.714 "block_size": 512, 00:09:38.714 "num_blocks": 65536, 00:09:38.714 "uuid": "2fb4628e-dbfc-4f98-bcd7-0b0634d838b3", 00:09:38.714 "assigned_rate_limits": { 00:09:38.714 "rw_ios_per_sec": 0, 00:09:38.714 "rw_mbytes_per_sec": 0, 00:09:38.714 "r_mbytes_per_sec": 0, 00:09:38.714 "w_mbytes_per_sec": 0 00:09:38.714 }, 00:09:38.714 "claimed": true, 00:09:38.714 "claim_type": "exclusive_write", 00:09:38.714 "zoned": false, 00:09:38.714 "supported_io_types": { 00:09:38.714 "read": true, 00:09:38.714 "write": true, 00:09:38.714 "unmap": true, 00:09:38.714 "flush": true, 00:09:38.714 "reset": true, 00:09:38.714 "nvme_admin": false, 00:09:38.714 "nvme_io": false, 00:09:38.714 "nvme_io_md": false, 00:09:38.714 "write_zeroes": true, 00:09:38.714 "zcopy": true, 00:09:38.714 "get_zone_info": false, 00:09:38.714 "zone_management": false, 00:09:38.714 "zone_append": false, 00:09:38.714 "compare": false, 00:09:38.714 "compare_and_write": false, 00:09:38.714 "abort": true, 00:09:38.714 "seek_hole": false, 00:09:38.714 "seek_data": false, 00:09:38.714 "copy": true, 00:09:38.714 "nvme_iov_md": false 00:09:38.714 }, 00:09:38.714 "memory_domains": [ 00:09:38.714 { 00:09:38.714 "dma_device_id": "system", 00:09:38.714 "dma_device_type": 1 00:09:38.714 }, 00:09:38.714 { 00:09:38.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:38.714 "dma_device_type": 2 00:09:38.714 } 00:09:38.714 ], 00:09:38.714 "driver_specific": {} 00:09:38.714 } 00:09:38.714 ] 00:09:38.714 11:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:09:38.714 
11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:38.714 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:38.714 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:38.714 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:38.714 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:38.715 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:38.715 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:38.715 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:38.715 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:38.715 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:38.715 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:38.715 11:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:38.972 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:38.972 "name": "Existed_Raid", 00:09:38.972 "uuid": "b5d22b20-061a-4f62-a963-adbb23de819f", 00:09:38.972 "strip_size_kb": 64, 00:09:38.972 "state": "configuring", 00:09:38.972 "raid_level": "raid0", 00:09:38.972 "superblock": true, 00:09:38.972 "num_base_bdevs": 2, 00:09:38.972 "num_base_bdevs_discovered": 1, 00:09:38.972 "num_base_bdevs_operational": 2, 00:09:38.972 
"base_bdevs_list": [ 00:09:38.972 { 00:09:38.972 "name": "BaseBdev1", 00:09:38.972 "uuid": "2fb4628e-dbfc-4f98-bcd7-0b0634d838b3", 00:09:38.972 "is_configured": true, 00:09:38.972 "data_offset": 2048, 00:09:38.972 "data_size": 63488 00:09:38.972 }, 00:09:38.972 { 00:09:38.972 "name": "BaseBdev2", 00:09:38.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:38.972 "is_configured": false, 00:09:38.972 "data_offset": 0, 00:09:38.972 "data_size": 0 00:09:38.972 } 00:09:38.972 ] 00:09:38.972 }' 00:09:38.972 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:38.972 11:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:39.537 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:39.537 [2024-07-12 11:50:29.686225] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:39.537 [2024-07-12 11:50:29.686257] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd8caa0 name Existed_Raid, state configuring 00:09:39.537 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:39.795 [2024-07-12 11:50:29.846675] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:39.795 [2024-07-12 11:50:29.847736] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:39.795 [2024-07-12 11:50:29.847760] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:39.795 11:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:39.795 11:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:39.795 "name": "Existed_Raid", 00:09:39.795 "uuid": "d7ba24eb-261e-45f5-b5b3-abebe7358f9c", 00:09:39.795 "strip_size_kb": 64, 00:09:39.795 "state": "configuring", 00:09:39.795 "raid_level": "raid0", 00:09:39.795 "superblock": true, 00:09:39.795 "num_base_bdevs": 2, 00:09:39.795 
"num_base_bdevs_discovered": 1, 00:09:39.795 "num_base_bdevs_operational": 2, 00:09:39.795 "base_bdevs_list": [ 00:09:39.795 { 00:09:39.795 "name": "BaseBdev1", 00:09:39.795 "uuid": "2fb4628e-dbfc-4f98-bcd7-0b0634d838b3", 00:09:39.795 "is_configured": true, 00:09:39.795 "data_offset": 2048, 00:09:39.795 "data_size": 63488 00:09:39.795 }, 00:09:39.795 { 00:09:39.795 "name": "BaseBdev2", 00:09:39.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:39.795 "is_configured": false, 00:09:39.795 "data_offset": 0, 00:09:39.795 "data_size": 0 00:09:39.795 } 00:09:39.795 ] 00:09:39.795 }' 00:09:39.795 11:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:39.795 11:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:40.359 11:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:40.617 [2024-07-12 11:50:30.675379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:40.617 [2024-07-12 11:50:30.675490] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd8d890 00:09:40.617 [2024-07-12 11:50:30.675499] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:40.617 [2024-07-12 11:50:30.675624] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd8bc20 00:09:40.617 [2024-07-12 11:50:30.675707] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd8d890 00:09:40.617 [2024-07-12 11:50:30.675712] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd8d890 00:09:40.617 [2024-07-12 11:50:30.675776] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:40.617 BaseBdev2 00:09:40.617 11:50:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:40.617 11:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:09:40.617 11:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:40.617 11:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:09:40.617 11:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:40.617 11:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:40.617 11:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:40.874 11:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:40.874 [ 00:09:40.874 { 00:09:40.874 "name": "BaseBdev2", 00:09:40.874 "aliases": [ 00:09:40.874 "6a325f7b-4988-4746-83af-b3650e73d057" 00:09:40.874 ], 00:09:40.874 "product_name": "Malloc disk", 00:09:40.874 "block_size": 512, 00:09:40.874 "num_blocks": 65536, 00:09:40.874 "uuid": "6a325f7b-4988-4746-83af-b3650e73d057", 00:09:40.874 "assigned_rate_limits": { 00:09:40.874 "rw_ios_per_sec": 0, 00:09:40.874 "rw_mbytes_per_sec": 0, 00:09:40.874 "r_mbytes_per_sec": 0, 00:09:40.874 "w_mbytes_per_sec": 0 00:09:40.874 }, 00:09:40.874 "claimed": true, 00:09:40.874 "claim_type": "exclusive_write", 00:09:40.874 "zoned": false, 00:09:40.874 "supported_io_types": { 00:09:40.874 "read": true, 00:09:40.874 "write": true, 00:09:40.874 "unmap": true, 00:09:40.874 "flush": true, 00:09:40.874 "reset": true, 00:09:40.874 "nvme_admin": false, 00:09:40.874 "nvme_io": false, 00:09:40.874 "nvme_io_md": false, 00:09:40.874 "write_zeroes": true, 
00:09:40.874 "zcopy": true, 00:09:40.874 "get_zone_info": false, 00:09:40.874 "zone_management": false, 00:09:40.874 "zone_append": false, 00:09:40.874 "compare": false, 00:09:40.874 "compare_and_write": false, 00:09:40.874 "abort": true, 00:09:40.874 "seek_hole": false, 00:09:40.874 "seek_data": false, 00:09:40.874 "copy": true, 00:09:40.874 "nvme_iov_md": false 00:09:40.874 }, 00:09:40.874 "memory_domains": [ 00:09:40.874 { 00:09:40.874 "dma_device_id": "system", 00:09:40.874 "dma_device_type": 1 00:09:40.874 }, 00:09:40.874 { 00:09:40.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:40.874 "dma_device_type": 2 00:09:40.874 } 00:09:40.874 ], 00:09:40.874 "driver_specific": {} 00:09:40.874 } 00:09:40.874 ] 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:09:40.874 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:40.875 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:40.875 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:40.875 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:41.133 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:41.133 "name": "Existed_Raid", 00:09:41.133 "uuid": "d7ba24eb-261e-45f5-b5b3-abebe7358f9c", 00:09:41.133 "strip_size_kb": 64, 00:09:41.133 "state": "online", 00:09:41.133 "raid_level": "raid0", 00:09:41.133 "superblock": true, 00:09:41.133 "num_base_bdevs": 2, 00:09:41.133 "num_base_bdevs_discovered": 2, 00:09:41.133 "num_base_bdevs_operational": 2, 00:09:41.133 "base_bdevs_list": [ 00:09:41.133 { 00:09:41.133 "name": "BaseBdev1", 00:09:41.133 "uuid": "2fb4628e-dbfc-4f98-bcd7-0b0634d838b3", 00:09:41.133 "is_configured": true, 00:09:41.133 "data_offset": 2048, 00:09:41.133 "data_size": 63488 00:09:41.133 }, 00:09:41.133 { 00:09:41.133 "name": "BaseBdev2", 00:09:41.133 "uuid": "6a325f7b-4988-4746-83af-b3650e73d057", 00:09:41.133 "is_configured": true, 00:09:41.133 "data_offset": 2048, 00:09:41.133 "data_size": 63488 00:09:41.133 } 00:09:41.133 ] 00:09:41.133 }' 00:09:41.133 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:41.133 11:50:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:41.702 [2024-07-12 11:50:31.826539] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:41.702 "name": "Existed_Raid", 00:09:41.702 "aliases": [ 00:09:41.702 "d7ba24eb-261e-45f5-b5b3-abebe7358f9c" 00:09:41.702 ], 00:09:41.702 "product_name": "Raid Volume", 00:09:41.702 "block_size": 512, 00:09:41.702 "num_blocks": 126976, 00:09:41.702 "uuid": "d7ba24eb-261e-45f5-b5b3-abebe7358f9c", 00:09:41.702 "assigned_rate_limits": { 00:09:41.702 "rw_ios_per_sec": 0, 00:09:41.702 "rw_mbytes_per_sec": 0, 00:09:41.702 "r_mbytes_per_sec": 0, 00:09:41.702 "w_mbytes_per_sec": 0 00:09:41.702 }, 00:09:41.702 "claimed": false, 00:09:41.702 "zoned": false, 00:09:41.702 "supported_io_types": { 00:09:41.702 "read": true, 00:09:41.702 "write": true, 00:09:41.702 "unmap": true, 00:09:41.702 "flush": true, 00:09:41.702 "reset": true, 00:09:41.702 "nvme_admin": false, 00:09:41.702 "nvme_io": false, 00:09:41.702 "nvme_io_md": false, 00:09:41.702 "write_zeroes": true, 00:09:41.702 "zcopy": false, 00:09:41.702 "get_zone_info": false, 00:09:41.702 "zone_management": false, 00:09:41.702 
"zone_append": false, 00:09:41.702 "compare": false, 00:09:41.702 "compare_and_write": false, 00:09:41.702 "abort": false, 00:09:41.702 "seek_hole": false, 00:09:41.702 "seek_data": false, 00:09:41.702 "copy": false, 00:09:41.702 "nvme_iov_md": false 00:09:41.702 }, 00:09:41.702 "memory_domains": [ 00:09:41.702 { 00:09:41.702 "dma_device_id": "system", 00:09:41.702 "dma_device_type": 1 00:09:41.702 }, 00:09:41.702 { 00:09:41.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:41.702 "dma_device_type": 2 00:09:41.702 }, 00:09:41.702 { 00:09:41.702 "dma_device_id": "system", 00:09:41.702 "dma_device_type": 1 00:09:41.702 }, 00:09:41.702 { 00:09:41.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:41.702 "dma_device_type": 2 00:09:41.702 } 00:09:41.702 ], 00:09:41.702 "driver_specific": { 00:09:41.702 "raid": { 00:09:41.702 "uuid": "d7ba24eb-261e-45f5-b5b3-abebe7358f9c", 00:09:41.702 "strip_size_kb": 64, 00:09:41.702 "state": "online", 00:09:41.702 "raid_level": "raid0", 00:09:41.702 "superblock": true, 00:09:41.702 "num_base_bdevs": 2, 00:09:41.702 "num_base_bdevs_discovered": 2, 00:09:41.702 "num_base_bdevs_operational": 2, 00:09:41.702 "base_bdevs_list": [ 00:09:41.702 { 00:09:41.702 "name": "BaseBdev1", 00:09:41.702 "uuid": "2fb4628e-dbfc-4f98-bcd7-0b0634d838b3", 00:09:41.702 "is_configured": true, 00:09:41.702 "data_offset": 2048, 00:09:41.702 "data_size": 63488 00:09:41.702 }, 00:09:41.702 { 00:09:41.702 "name": "BaseBdev2", 00:09:41.702 "uuid": "6a325f7b-4988-4746-83af-b3650e73d057", 00:09:41.702 "is_configured": true, 00:09:41.702 "data_offset": 2048, 00:09:41.702 "data_size": 63488 00:09:41.702 } 00:09:41.702 ] 00:09:41.702 } 00:09:41.702 } 00:09:41.702 }' 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:41.702 
BaseBdev2' 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:41.702 11:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:41.962 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:41.962 "name": "BaseBdev1", 00:09:41.962 "aliases": [ 00:09:41.962 "2fb4628e-dbfc-4f98-bcd7-0b0634d838b3" 00:09:41.962 ], 00:09:41.962 "product_name": "Malloc disk", 00:09:41.962 "block_size": 512, 00:09:41.962 "num_blocks": 65536, 00:09:41.962 "uuid": "2fb4628e-dbfc-4f98-bcd7-0b0634d838b3", 00:09:41.962 "assigned_rate_limits": { 00:09:41.962 "rw_ios_per_sec": 0, 00:09:41.962 "rw_mbytes_per_sec": 0, 00:09:41.962 "r_mbytes_per_sec": 0, 00:09:41.962 "w_mbytes_per_sec": 0 00:09:41.962 }, 00:09:41.962 "claimed": true, 00:09:41.962 "claim_type": "exclusive_write", 00:09:41.962 "zoned": false, 00:09:41.962 "supported_io_types": { 00:09:41.962 "read": true, 00:09:41.962 "write": true, 00:09:41.962 "unmap": true, 00:09:41.962 "flush": true, 00:09:41.962 "reset": true, 00:09:41.962 "nvme_admin": false, 00:09:41.962 "nvme_io": false, 00:09:41.962 "nvme_io_md": false, 00:09:41.962 "write_zeroes": true, 00:09:41.962 "zcopy": true, 00:09:41.962 "get_zone_info": false, 00:09:41.962 "zone_management": false, 00:09:41.962 "zone_append": false, 00:09:41.962 "compare": false, 00:09:41.962 "compare_and_write": false, 00:09:41.962 "abort": true, 00:09:41.962 "seek_hole": false, 00:09:41.962 "seek_data": false, 00:09:41.962 "copy": true, 00:09:41.962 "nvme_iov_md": false 00:09:41.962 }, 00:09:41.962 "memory_domains": [ 00:09:41.962 { 00:09:41.962 "dma_device_id": "system", 00:09:41.962 "dma_device_type": 1 00:09:41.962 }, 00:09:41.962 { 
00:09:41.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:41.962 "dma_device_type": 2 00:09:41.962 } 00:09:41.962 ], 00:09:41.962 "driver_specific": {} 00:09:41.962 }' 00:09:41.962 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:41.962 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:41.962 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:41.962 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:41.962 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:42.221 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:42.479 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:42.479 "name": 
"BaseBdev2", 00:09:42.479 "aliases": [ 00:09:42.479 "6a325f7b-4988-4746-83af-b3650e73d057" 00:09:42.479 ], 00:09:42.479 "product_name": "Malloc disk", 00:09:42.479 "block_size": 512, 00:09:42.479 "num_blocks": 65536, 00:09:42.479 "uuid": "6a325f7b-4988-4746-83af-b3650e73d057", 00:09:42.479 "assigned_rate_limits": { 00:09:42.479 "rw_ios_per_sec": 0, 00:09:42.479 "rw_mbytes_per_sec": 0, 00:09:42.479 "r_mbytes_per_sec": 0, 00:09:42.479 "w_mbytes_per_sec": 0 00:09:42.479 }, 00:09:42.479 "claimed": true, 00:09:42.480 "claim_type": "exclusive_write", 00:09:42.480 "zoned": false, 00:09:42.480 "supported_io_types": { 00:09:42.480 "read": true, 00:09:42.480 "write": true, 00:09:42.480 "unmap": true, 00:09:42.480 "flush": true, 00:09:42.480 "reset": true, 00:09:42.480 "nvme_admin": false, 00:09:42.480 "nvme_io": false, 00:09:42.480 "nvme_io_md": false, 00:09:42.480 "write_zeroes": true, 00:09:42.480 "zcopy": true, 00:09:42.480 "get_zone_info": false, 00:09:42.480 "zone_management": false, 00:09:42.480 "zone_append": false, 00:09:42.480 "compare": false, 00:09:42.480 "compare_and_write": false, 00:09:42.480 "abort": true, 00:09:42.480 "seek_hole": false, 00:09:42.480 "seek_data": false, 00:09:42.480 "copy": true, 00:09:42.480 "nvme_iov_md": false 00:09:42.480 }, 00:09:42.480 "memory_domains": [ 00:09:42.480 { 00:09:42.480 "dma_device_id": "system", 00:09:42.480 "dma_device_type": 1 00:09:42.480 }, 00:09:42.480 { 00:09:42.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:42.480 "dma_device_type": 2 00:09:42.480 } 00:09:42.480 ], 00:09:42.480 "driver_specific": {} 00:09:42.480 }' 00:09:42.480 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:42.480 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:42.480 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:42.480 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:09:42.480 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:42.480 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:42.480 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:42.738 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:42.738 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:42.738 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:42.738 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:42.738 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:42.738 11:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:42.997 [2024-07-12 11:50:33.013483] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:42.997 [2024-07-12 11:50:33.013504] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:42.997 [2024-07-12 11:50:33.013539] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:42.997 11:50:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:42.997 "name": "Existed_Raid", 00:09:42.997 "uuid": "d7ba24eb-261e-45f5-b5b3-abebe7358f9c", 00:09:42.997 "strip_size_kb": 64, 00:09:42.997 "state": "offline", 00:09:42.997 "raid_level": "raid0", 00:09:42.997 "superblock": true, 00:09:42.997 "num_base_bdevs": 2, 00:09:42.997 "num_base_bdevs_discovered": 1, 00:09:42.997 "num_base_bdevs_operational": 1, 00:09:42.997 "base_bdevs_list": [ 
00:09:42.997 { 00:09:42.997 "name": null, 00:09:42.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:42.997 "is_configured": false, 00:09:42.997 "data_offset": 2048, 00:09:42.997 "data_size": 63488 00:09:42.997 }, 00:09:42.997 { 00:09:42.997 "name": "BaseBdev2", 00:09:42.997 "uuid": "6a325f7b-4988-4746-83af-b3650e73d057", 00:09:42.997 "is_configured": true, 00:09:42.997 "data_offset": 2048, 00:09:42.997 "data_size": 63488 00:09:42.997 } 00:09:42.997 ] 00:09:42.997 }' 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:42.997 11:50:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:43.563 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:43.563 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:43.563 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:43.563 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:43.822 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:43.822 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:43.822 11:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:43.822 [2024-07-12 11:50:33.992867] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:43.822 [2024-07-12 11:50:33.992907] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd8d890 name Existed_Raid, state offline 00:09:43.822 11:50:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:43.822 11:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:43.822 11:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:43.822 11:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 591299 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 591299 ']' 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 591299 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 591299 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 591299' 00:09:44.082 killing process with pid 591299 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 591299 00:09:44.082 [2024-07-12 11:50:34.226380] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:44.082 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 591299 00:09:44.082 [2024-07-12 11:50:34.227172] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:44.341 11:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:09:44.341 00:09:44.341 real 0m7.997s 00:09:44.341 user 0m14.419s 00:09:44.341 sys 0m1.228s 00:09:44.341 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:44.341 11:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:44.341 ************************************ 00:09:44.341 END TEST raid_state_function_test_sb 00:09:44.341 ************************************ 00:09:44.341 11:50:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:44.341 11:50:34 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:09:44.341 11:50:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:09:44.341 11:50:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:44.341 11:50:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:44.341 ************************************ 00:09:44.341 START TEST raid_superblock_test 00:09:44.341 ************************************ 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:09:44.341 11:50:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=592890 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 592890 /var/tmp/spdk-raid.sock 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 592890 ']' 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:44.341 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:44.341 11:50:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:44.341 [2024-07-12 11:50:34.501931] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:09:44.342 [2024-07-12 11:50:34.501971] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid592890 ] 00:09:44.342 [2024-07-12 11:50:34.566616] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:44.600 [2024-07-12 11:50:34.645121] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:44.600 [2024-07-12 11:50:34.693892] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:44.600 [2024-07-12 11:50:34.693916] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 
00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:45.168 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:09:45.427 malloc1 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:45.427 [2024-07-12 11:50:35.597289] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:45.427 [2024-07-12 11:50:35.597319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:45.427 [2024-07-12 11:50:35.597330] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18dd270 00:09:45.427 [2024-07-12 11:50:35.597352] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:45.427 [2024-07-12 11:50:35.598459] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:45.427 [2024-07-12 11:50:35.598479] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:45.427 pt1 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= 
num_base_bdevs )) 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:45.427 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:09:45.686 malloc2 00:09:45.686 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:45.686 [2024-07-12 11:50:35.929796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:45.686 [2024-07-12 11:50:35.929827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:45.686 [2024-07-12 11:50:35.929836] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18de580 00:09:45.686 [2024-07-12 11:50:35.929842] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:45.686 [2024-07-12 11:50:35.930911] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:45.686 [2024-07-12 11:50:35.930932] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:45.945 pt2 00:09:45.945 11:50:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:45.945 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:45.945 11:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:09:45.945 [2024-07-12 11:50:36.098259] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:45.945 [2024-07-12 11:50:36.099148] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:45.945 [2024-07-12 11:50:36.099245] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a88890 00:09:45.945 [2024-07-12 11:50:36.099253] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:45.945 [2024-07-12 11:50:36.099387] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a8a8f0 00:09:45.945 [2024-07-12 11:50:36.099480] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a88890 00:09:45.945 [2024-07-12 11:50:36.099485] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a88890 00:09:45.945 [2024-07-12 11:50:36.099557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:45.945 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:46.204 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:46.204 "name": "raid_bdev1", 00:09:46.204 "uuid": "7444bb91-a4fb-4748-9bb0-1c54e090632b", 00:09:46.204 "strip_size_kb": 64, 00:09:46.204 "state": "online", 00:09:46.204 "raid_level": "raid0", 00:09:46.204 "superblock": true, 00:09:46.204 "num_base_bdevs": 2, 00:09:46.204 "num_base_bdevs_discovered": 2, 00:09:46.204 "num_base_bdevs_operational": 2, 00:09:46.204 "base_bdevs_list": [ 00:09:46.204 { 00:09:46.204 "name": "pt1", 00:09:46.204 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:46.204 "is_configured": true, 00:09:46.204 "data_offset": 2048, 00:09:46.204 "data_size": 63488 00:09:46.204 }, 00:09:46.204 { 00:09:46.204 "name": "pt2", 00:09:46.204 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:46.204 "is_configured": true, 00:09:46.204 "data_offset": 2048, 00:09:46.204 "data_size": 63488 00:09:46.204 } 00:09:46.204 ] 00:09:46.204 }' 00:09:46.204 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:46.204 11:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:46.773 11:50:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:46.773 [2024-07-12 11:50:36.880446] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:46.773 "name": "raid_bdev1", 00:09:46.773 "aliases": [ 00:09:46.773 "7444bb91-a4fb-4748-9bb0-1c54e090632b" 00:09:46.773 ], 00:09:46.773 "product_name": "Raid Volume", 00:09:46.773 "block_size": 512, 00:09:46.773 "num_blocks": 126976, 00:09:46.773 "uuid": "7444bb91-a4fb-4748-9bb0-1c54e090632b", 00:09:46.773 "assigned_rate_limits": { 00:09:46.773 "rw_ios_per_sec": 0, 00:09:46.773 "rw_mbytes_per_sec": 0, 00:09:46.773 "r_mbytes_per_sec": 0, 00:09:46.773 "w_mbytes_per_sec": 0 00:09:46.773 }, 00:09:46.773 "claimed": false, 00:09:46.773 "zoned": false, 00:09:46.773 "supported_io_types": { 00:09:46.773 "read": true, 00:09:46.773 "write": true, 00:09:46.773 "unmap": true, 00:09:46.773 "flush": true, 00:09:46.773 "reset": true, 00:09:46.773 "nvme_admin": false, 00:09:46.773 "nvme_io": false, 00:09:46.773 "nvme_io_md": false, 00:09:46.773 "write_zeroes": 
true, 00:09:46.773 "zcopy": false, 00:09:46.773 "get_zone_info": false, 00:09:46.773 "zone_management": false, 00:09:46.773 "zone_append": false, 00:09:46.773 "compare": false, 00:09:46.773 "compare_and_write": false, 00:09:46.773 "abort": false, 00:09:46.773 "seek_hole": false, 00:09:46.773 "seek_data": false, 00:09:46.773 "copy": false, 00:09:46.773 "nvme_iov_md": false 00:09:46.773 }, 00:09:46.773 "memory_domains": [ 00:09:46.773 { 00:09:46.773 "dma_device_id": "system", 00:09:46.773 "dma_device_type": 1 00:09:46.773 }, 00:09:46.773 { 00:09:46.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:46.773 "dma_device_type": 2 00:09:46.773 }, 00:09:46.773 { 00:09:46.773 "dma_device_id": "system", 00:09:46.773 "dma_device_type": 1 00:09:46.773 }, 00:09:46.773 { 00:09:46.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:46.773 "dma_device_type": 2 00:09:46.773 } 00:09:46.773 ], 00:09:46.773 "driver_specific": { 00:09:46.773 "raid": { 00:09:46.773 "uuid": "7444bb91-a4fb-4748-9bb0-1c54e090632b", 00:09:46.773 "strip_size_kb": 64, 00:09:46.773 "state": "online", 00:09:46.773 "raid_level": "raid0", 00:09:46.773 "superblock": true, 00:09:46.773 "num_base_bdevs": 2, 00:09:46.773 "num_base_bdevs_discovered": 2, 00:09:46.773 "num_base_bdevs_operational": 2, 00:09:46.773 "base_bdevs_list": [ 00:09:46.773 { 00:09:46.773 "name": "pt1", 00:09:46.773 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:46.773 "is_configured": true, 00:09:46.773 "data_offset": 2048, 00:09:46.773 "data_size": 63488 00:09:46.773 }, 00:09:46.773 { 00:09:46.773 "name": "pt2", 00:09:46.773 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:46.773 "is_configured": true, 00:09:46.773 "data_offset": 2048, 00:09:46.773 "data_size": 63488 00:09:46.773 } 00:09:46.773 ] 00:09:46.773 } 00:09:46.773 } 00:09:46.773 }' 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:46.773 11:50:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:46.773 pt2' 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:46.773 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:46.774 11:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:47.032 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:47.032 "name": "pt1", 00:09:47.032 "aliases": [ 00:09:47.032 "00000000-0000-0000-0000-000000000001" 00:09:47.032 ], 00:09:47.032 "product_name": "passthru", 00:09:47.032 "block_size": 512, 00:09:47.032 "num_blocks": 65536, 00:09:47.032 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:47.032 "assigned_rate_limits": { 00:09:47.032 "rw_ios_per_sec": 0, 00:09:47.032 "rw_mbytes_per_sec": 0, 00:09:47.032 "r_mbytes_per_sec": 0, 00:09:47.032 "w_mbytes_per_sec": 0 00:09:47.032 }, 00:09:47.032 "claimed": true, 00:09:47.032 "claim_type": "exclusive_write", 00:09:47.032 "zoned": false, 00:09:47.032 "supported_io_types": { 00:09:47.032 "read": true, 00:09:47.032 "write": true, 00:09:47.032 "unmap": true, 00:09:47.032 "flush": true, 00:09:47.032 "reset": true, 00:09:47.032 "nvme_admin": false, 00:09:47.032 "nvme_io": false, 00:09:47.032 "nvme_io_md": false, 00:09:47.032 "write_zeroes": true, 00:09:47.032 "zcopy": true, 00:09:47.032 "get_zone_info": false, 00:09:47.032 "zone_management": false, 00:09:47.032 "zone_append": false, 00:09:47.032 "compare": false, 00:09:47.032 "compare_and_write": false, 00:09:47.032 "abort": true, 00:09:47.032 "seek_hole": false, 00:09:47.032 "seek_data": false, 00:09:47.032 "copy": true, 00:09:47.032 "nvme_iov_md": false 00:09:47.032 }, 00:09:47.032 "memory_domains": [ 00:09:47.032 { 00:09:47.032 "dma_device_id": "system", 00:09:47.032 
"dma_device_type": 1 00:09:47.032 }, 00:09:47.032 { 00:09:47.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:47.032 "dma_device_type": 2 00:09:47.032 } 00:09:47.032 ], 00:09:47.032 "driver_specific": { 00:09:47.032 "passthru": { 00:09:47.032 "name": "pt1", 00:09:47.032 "base_bdev_name": "malloc1" 00:09:47.032 } 00:09:47.032 } 00:09:47.032 }' 00:09:47.032 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:47.032 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:47.032 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:47.032 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:47.032 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:47.032 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:47.032 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:47.291 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:47.291 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:47.291 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:47.291 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:47.291 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:47.291 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:47.291 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:47.291 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:47.549 11:50:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:47.549 "name": "pt2", 00:09:47.549 "aliases": [ 00:09:47.549 "00000000-0000-0000-0000-000000000002" 00:09:47.549 ], 00:09:47.549 "product_name": "passthru", 00:09:47.549 "block_size": 512, 00:09:47.549 "num_blocks": 65536, 00:09:47.549 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:47.549 "assigned_rate_limits": { 00:09:47.549 "rw_ios_per_sec": 0, 00:09:47.549 "rw_mbytes_per_sec": 0, 00:09:47.549 "r_mbytes_per_sec": 0, 00:09:47.549 "w_mbytes_per_sec": 0 00:09:47.549 }, 00:09:47.549 "claimed": true, 00:09:47.549 "claim_type": "exclusive_write", 00:09:47.549 "zoned": false, 00:09:47.549 "supported_io_types": { 00:09:47.549 "read": true, 00:09:47.549 "write": true, 00:09:47.549 "unmap": true, 00:09:47.549 "flush": true, 00:09:47.549 "reset": true, 00:09:47.549 "nvme_admin": false, 00:09:47.549 "nvme_io": false, 00:09:47.549 "nvme_io_md": false, 00:09:47.549 "write_zeroes": true, 00:09:47.549 "zcopy": true, 00:09:47.549 "get_zone_info": false, 00:09:47.549 "zone_management": false, 00:09:47.549 "zone_append": false, 00:09:47.549 "compare": false, 00:09:47.549 "compare_and_write": false, 00:09:47.550 "abort": true, 00:09:47.550 "seek_hole": false, 00:09:47.550 "seek_data": false, 00:09:47.550 "copy": true, 00:09:47.550 "nvme_iov_md": false 00:09:47.550 }, 00:09:47.550 "memory_domains": [ 00:09:47.550 { 00:09:47.550 "dma_device_id": "system", 00:09:47.550 "dma_device_type": 1 00:09:47.550 }, 00:09:47.550 { 00:09:47.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:47.550 "dma_device_type": 2 00:09:47.550 } 00:09:47.550 ], 00:09:47.550 "driver_specific": { 00:09:47.550 "passthru": { 00:09:47.550 "name": "pt2", 00:09:47.550 "base_bdev_name": "malloc2" 00:09:47.550 } 00:09:47.550 } 00:09:47.550 }' 00:09:47.550 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:47.550 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:47.550 11:50:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:47.550 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:47.550 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:47.550 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:47.550 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:47.809 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:47.809 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:47.809 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:47.809 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:47.809 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:47.809 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:09:47.809 11:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:48.067 [2024-07-12 11:50:38.071493] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:48.067 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7444bb91-a4fb-4748-9bb0-1c54e090632b 00:09:48.067 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7444bb91-a4fb-4748-9bb0-1c54e090632b ']' 00:09:48.067 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:48.067 [2024-07-12 11:50:38.239764] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:48.067 
[2024-07-12 11:50:38.239776] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:48.067 [2024-07-12 11:50:38.239812] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:48.067 [2024-07-12 11:50:38.239841] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:48.067 [2024-07-12 11:50:38.239847] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a88890 name raid_bdev1, state offline 00:09:48.067 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:48.067 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:09:48.325 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:09:48.325 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:09:48.325 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:48.325 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:09:48.583 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:48.583 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:09:48.583 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:09:48.583 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:09:48.842 11:50:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:48.842 11:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:48.842 [2024-07-12 11:50:39.065883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:09:48.842 [2024-07-12 11:50:39.066854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:09:48.842 [2024-07-12 11:50:39.066895] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:09:48.842 [2024-07-12 11:50:39.066919] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:09:48.842 [2024-07-12 11:50:39.066928] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:48.842 [2024-07-12 11:50:39.066949] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a8a890 name raid_bdev1, state configuring 00:09:48.842 request: 00:09:48.842 { 00:09:48.842 "name": "raid_bdev1", 00:09:48.842 "raid_level": "raid0", 00:09:48.842 "base_bdevs": [ 00:09:48.842 "malloc1", 00:09:48.842 "malloc2" 00:09:48.842 ], 00:09:48.842 "superblock": false, 00:09:48.842 "strip_size_kb": 64, 00:09:48.842 "method": "bdev_raid_create", 00:09:48.842 "req_id": 1 00:09:48.842 } 00:09:48.842 Got JSON-RPC error response 00:09:48.842 response: 00:09:48.842 { 00:09:48.842 "code": -17, 00:09:48.842 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:09:48.842 } 00:09:48.842 11:50:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:09:48.842 11:50:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:48.842 11:50:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:48.842 11:50:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:48.842 11:50:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:48.842 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:09:49.133 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:09:49.133 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:09:49.133 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:49.486 [2024-07-12 11:50:39.390690] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:49.486 [2024-07-12 11:50:39.390716] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:49.486 [2024-07-12 11:50:39.390726] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18dbae0 00:09:49.486 [2024-07-12 11:50:39.390732] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:49.486 [2024-07-12 11:50:39.391896] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:49.486 [2024-07-12 11:50:39.391918] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:49.486 [2024-07-12 11:50:39.391966] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:09:49.486 [2024-07-12 11:50:39.391986] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:49.486 pt1 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:49.486 11:50:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:49.486 "name": "raid_bdev1", 00:09:49.486 "uuid": "7444bb91-a4fb-4748-9bb0-1c54e090632b", 00:09:49.486 "strip_size_kb": 64, 00:09:49.486 "state": "configuring", 00:09:49.486 "raid_level": "raid0", 00:09:49.486 "superblock": true, 00:09:49.486 "num_base_bdevs": 2, 00:09:49.486 "num_base_bdevs_discovered": 1, 00:09:49.486 "num_base_bdevs_operational": 2, 00:09:49.486 "base_bdevs_list": [ 00:09:49.486 { 00:09:49.486 "name": "pt1", 00:09:49.486 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:49.486 "is_configured": true, 00:09:49.486 "data_offset": 2048, 00:09:49.486 "data_size": 63488 00:09:49.486 }, 00:09:49.486 { 00:09:49.486 "name": null, 00:09:49.486 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:49.486 
"is_configured": false, 00:09:49.486 "data_offset": 2048, 00:09:49.486 "data_size": 63488 00:09:49.486 } 00:09:49.486 ] 00:09:49.486 }' 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:49.486 11:50:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:50.180 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:09:50.180 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:09:50.180 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:50.180 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:50.180 [2024-07-12 11:50:40.208811] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:50.180 [2024-07-12 11:50:40.208847] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:50.180 [2024-07-12 11:50:40.208859] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18dd4a0 00:09:50.180 [2024-07-12 11:50:40.208865] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:50.180 [2024-07-12 11:50:40.209115] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:50.180 [2024-07-12 11:50:40.209125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:50.181 [2024-07-12 11:50:40.209169] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:09:50.181 [2024-07-12 11:50:40.209181] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:50.181 [2024-07-12 11:50:40.209250] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a87bf0 00:09:50.181 [2024-07-12 
11:50:40.209255] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:50.181 [2024-07-12 11:50:40.209364] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18dddf0 00:09:50.181 [2024-07-12 11:50:40.209449] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a87bf0 00:09:50.181 [2024-07-12 11:50:40.209454] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a87bf0 00:09:50.181 [2024-07-12 11:50:40.209516] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:50.181 pt2 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:50.181 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:50.439 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:50.439 "name": "raid_bdev1", 00:09:50.439 "uuid": "7444bb91-a4fb-4748-9bb0-1c54e090632b", 00:09:50.439 "strip_size_kb": 64, 00:09:50.439 "state": "online", 00:09:50.439 "raid_level": "raid0", 00:09:50.439 "superblock": true, 00:09:50.439 "num_base_bdevs": 2, 00:09:50.439 "num_base_bdevs_discovered": 2, 00:09:50.439 "num_base_bdevs_operational": 2, 00:09:50.439 "base_bdevs_list": [ 00:09:50.439 { 00:09:50.439 "name": "pt1", 00:09:50.439 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:50.439 "is_configured": true, 00:09:50.439 "data_offset": 2048, 00:09:50.439 "data_size": 63488 00:09:50.439 }, 00:09:50.439 { 00:09:50.439 "name": "pt2", 00:09:50.439 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:50.439 "is_configured": true, 00:09:50.439 "data_offset": 2048, 00:09:50.439 "data_size": 63488 00:09:50.439 } 00:09:50.439 ] 00:09:50.439 }' 00:09:50.439 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:50.439 11:50:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:50.697 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:09:50.697 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:50.697 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:50.697 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:50.697 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:50.697 11:50:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:50.697 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:50.697 11:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:50.956 [2024-07-12 11:50:40.995018] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:50.956 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:50.956 "name": "raid_bdev1", 00:09:50.956 "aliases": [ 00:09:50.956 "7444bb91-a4fb-4748-9bb0-1c54e090632b" 00:09:50.956 ], 00:09:50.956 "product_name": "Raid Volume", 00:09:50.956 "block_size": 512, 00:09:50.956 "num_blocks": 126976, 00:09:50.956 "uuid": "7444bb91-a4fb-4748-9bb0-1c54e090632b", 00:09:50.956 "assigned_rate_limits": { 00:09:50.956 "rw_ios_per_sec": 0, 00:09:50.956 "rw_mbytes_per_sec": 0, 00:09:50.956 "r_mbytes_per_sec": 0, 00:09:50.956 "w_mbytes_per_sec": 0 00:09:50.956 }, 00:09:50.956 "claimed": false, 00:09:50.956 "zoned": false, 00:09:50.956 "supported_io_types": { 00:09:50.956 "read": true, 00:09:50.956 "write": true, 00:09:50.956 "unmap": true, 00:09:50.956 "flush": true, 00:09:50.956 "reset": true, 00:09:50.956 "nvme_admin": false, 00:09:50.956 "nvme_io": false, 00:09:50.956 "nvme_io_md": false, 00:09:50.956 "write_zeroes": true, 00:09:50.956 "zcopy": false, 00:09:50.956 "get_zone_info": false, 00:09:50.956 "zone_management": false, 00:09:50.956 "zone_append": false, 00:09:50.956 "compare": false, 00:09:50.956 "compare_and_write": false, 00:09:50.956 "abort": false, 00:09:50.956 "seek_hole": false, 00:09:50.956 "seek_data": false, 00:09:50.956 "copy": false, 00:09:50.956 "nvme_iov_md": false 00:09:50.956 }, 00:09:50.956 "memory_domains": [ 00:09:50.956 { 00:09:50.956 "dma_device_id": "system", 00:09:50.956 "dma_device_type": 1 00:09:50.956 }, 00:09:50.956 { 
00:09:50.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:50.956 "dma_device_type": 2 00:09:50.956 }, 00:09:50.956 { 00:09:50.956 "dma_device_id": "system", 00:09:50.956 "dma_device_type": 1 00:09:50.956 }, 00:09:50.956 { 00:09:50.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:50.956 "dma_device_type": 2 00:09:50.956 } 00:09:50.956 ], 00:09:50.956 "driver_specific": { 00:09:50.956 "raid": { 00:09:50.956 "uuid": "7444bb91-a4fb-4748-9bb0-1c54e090632b", 00:09:50.956 "strip_size_kb": 64, 00:09:50.956 "state": "online", 00:09:50.956 "raid_level": "raid0", 00:09:50.956 "superblock": true, 00:09:50.956 "num_base_bdevs": 2, 00:09:50.956 "num_base_bdevs_discovered": 2, 00:09:50.956 "num_base_bdevs_operational": 2, 00:09:50.956 "base_bdevs_list": [ 00:09:50.956 { 00:09:50.956 "name": "pt1", 00:09:50.956 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:50.956 "is_configured": true, 00:09:50.956 "data_offset": 2048, 00:09:50.956 "data_size": 63488 00:09:50.956 }, 00:09:50.956 { 00:09:50.956 "name": "pt2", 00:09:50.956 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:50.956 "is_configured": true, 00:09:50.956 "data_offset": 2048, 00:09:50.956 "data_size": 63488 00:09:50.956 } 00:09:50.956 ] 00:09:50.956 } 00:09:50.956 } 00:09:50.956 }' 00:09:50.956 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:50.956 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:50.956 pt2' 00:09:50.956 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:50.956 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:50.956 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:51.215 11:50:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:51.215 "name": "pt1", 00:09:51.215 "aliases": [ 00:09:51.215 "00000000-0000-0000-0000-000000000001" 00:09:51.215 ], 00:09:51.215 "product_name": "passthru", 00:09:51.215 "block_size": 512, 00:09:51.215 "num_blocks": 65536, 00:09:51.215 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:51.215 "assigned_rate_limits": { 00:09:51.215 "rw_ios_per_sec": 0, 00:09:51.215 "rw_mbytes_per_sec": 0, 00:09:51.215 "r_mbytes_per_sec": 0, 00:09:51.215 "w_mbytes_per_sec": 0 00:09:51.215 }, 00:09:51.215 "claimed": true, 00:09:51.215 "claim_type": "exclusive_write", 00:09:51.215 "zoned": false, 00:09:51.215 "supported_io_types": { 00:09:51.215 "read": true, 00:09:51.215 "write": true, 00:09:51.215 "unmap": true, 00:09:51.215 "flush": true, 00:09:51.215 "reset": true, 00:09:51.215 "nvme_admin": false, 00:09:51.215 "nvme_io": false, 00:09:51.215 "nvme_io_md": false, 00:09:51.215 "write_zeroes": true, 00:09:51.215 "zcopy": true, 00:09:51.216 "get_zone_info": false, 00:09:51.216 "zone_management": false, 00:09:51.216 "zone_append": false, 00:09:51.216 "compare": false, 00:09:51.216 "compare_and_write": false, 00:09:51.216 "abort": true, 00:09:51.216 "seek_hole": false, 00:09:51.216 "seek_data": false, 00:09:51.216 "copy": true, 00:09:51.216 "nvme_iov_md": false 00:09:51.216 }, 00:09:51.216 "memory_domains": [ 00:09:51.216 { 00:09:51.216 "dma_device_id": "system", 00:09:51.216 "dma_device_type": 1 00:09:51.216 }, 00:09:51.216 { 00:09:51.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:51.216 "dma_device_type": 2 00:09:51.216 } 00:09:51.216 ], 00:09:51.216 "driver_specific": { 00:09:51.216 "passthru": { 00:09:51.216 "name": "pt1", 00:09:51.216 "base_bdev_name": "malloc1" 00:09:51.216 } 00:09:51.216 } 00:09:51.216 }' 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:51.216 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:51.475 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:51.475 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:51.475 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:51.475 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:51.475 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:51.475 "name": "pt2", 00:09:51.475 "aliases": [ 00:09:51.475 "00000000-0000-0000-0000-000000000002" 00:09:51.475 ], 00:09:51.475 "product_name": "passthru", 00:09:51.475 "block_size": 512, 00:09:51.475 "num_blocks": 65536, 00:09:51.475 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:51.475 "assigned_rate_limits": { 00:09:51.475 "rw_ios_per_sec": 0, 00:09:51.475 "rw_mbytes_per_sec": 0, 00:09:51.475 "r_mbytes_per_sec": 0, 00:09:51.475 "w_mbytes_per_sec": 0 00:09:51.475 }, 
00:09:51.475 "claimed": true, 00:09:51.475 "claim_type": "exclusive_write", 00:09:51.475 "zoned": false, 00:09:51.475 "supported_io_types": { 00:09:51.475 "read": true, 00:09:51.475 "write": true, 00:09:51.475 "unmap": true, 00:09:51.475 "flush": true, 00:09:51.475 "reset": true, 00:09:51.475 "nvme_admin": false, 00:09:51.475 "nvme_io": false, 00:09:51.475 "nvme_io_md": false, 00:09:51.475 "write_zeroes": true, 00:09:51.475 "zcopy": true, 00:09:51.475 "get_zone_info": false, 00:09:51.475 "zone_management": false, 00:09:51.475 "zone_append": false, 00:09:51.475 "compare": false, 00:09:51.475 "compare_and_write": false, 00:09:51.475 "abort": true, 00:09:51.475 "seek_hole": false, 00:09:51.475 "seek_data": false, 00:09:51.475 "copy": true, 00:09:51.475 "nvme_iov_md": false 00:09:51.475 }, 00:09:51.475 "memory_domains": [ 00:09:51.475 { 00:09:51.475 "dma_device_id": "system", 00:09:51.475 "dma_device_type": 1 00:09:51.475 }, 00:09:51.475 { 00:09:51.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:51.475 "dma_device_type": 2 00:09:51.475 } 00:09:51.475 ], 00:09:51.475 "driver_specific": { 00:09:51.475 "passthru": { 00:09:51.475 "name": "pt2", 00:09:51.475 "base_bdev_name": "malloc2" 00:09:51.475 } 00:09:51.475 } 00:09:51.475 }' 00:09:51.475 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:51.475 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:51.475 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:51.475 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:09:51.734 11:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:51.992 [2024-07-12 11:50:42.093870] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:51.992 11:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 7444bb91-a4fb-4748-9bb0-1c54e090632b '!=' 7444bb91-a4fb-4748-9bb0-1c54e090632b ']' 00:09:51.992 11:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:09:51.992 11:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:51.992 11:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:51.992 11:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 592890 00:09:51.993 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 592890 ']' 00:09:51.993 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 592890 00:09:51.993 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:09:51.993 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:51.993 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 592890 00:09:51.993 11:50:42 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:51.993 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:51.993 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 592890' 00:09:51.993 killing process with pid 592890 00:09:51.993 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 592890 00:09:51.993 [2024-07-12 11:50:42.162379] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:51.993 [2024-07-12 11:50:42.162418] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:51.993 [2024-07-12 11:50:42.162449] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:51.993 [2024-07-12 11:50:42.162454] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a87bf0 name raid_bdev1, state offline 00:09:51.993 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 592890 00:09:51.993 [2024-07-12 11:50:42.177504] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:52.252 11:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:09:52.252 00:09:52.252 real 0m7.885s 00:09:52.252 user 0m14.268s 00:09:52.252 sys 0m1.208s 00:09:52.252 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:52.252 11:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:52.252 ************************************ 00:09:52.252 END TEST raid_superblock_test 00:09:52.252 ************************************ 00:09:52.252 11:50:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:52.252 11:50:42 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:09:52.252 11:50:42 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:52.252 11:50:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.252 11:50:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:52.252 ************************************ 00:09:52.252 START TEST raid_read_error_test 00:09:52.252 ************************************ 00:09:52.252 11:50:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:09:52.252 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:09:52.252 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:52.252 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:09:52.252 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.a7Pn0ZNdSj 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=594404 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 594404 /var/tmp/spdk-raid.sock 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 594404 ']' 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:09:52.253 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:52.253 11:50:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:52.253 [2024-07-12 11:50:42.466888] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:09:52.253 [2024-07-12 11:50:42.466923] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid594404 ] 00:09:52.512 [2024-07-12 11:50:42.530366] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:52.512 [2024-07-12 11:50:42.607748] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:52.512 [2024-07-12 11:50:42.662570] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:52.512 [2024-07-12 11:50:42.662598] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:53.080 11:50:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:53.080 11:50:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:09:53.080 11:50:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:53.080 11:50:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:53.338 BaseBdev1_malloc 00:09:53.338 11:50:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:53.338 true 00:09:53.597 11:50:43 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:53.597 [2024-07-12 11:50:43.746680] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:53.597 [2024-07-12 11:50:43.746712] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:53.597 [2024-07-12 11:50:43.746723] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x129d2d0 00:09:53.597 [2024-07-12 11:50:43.746729] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:53.597 [2024-07-12 11:50:43.747966] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:53.597 [2024-07-12 11:50:43.747986] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:53.597 BaseBdev1 00:09:53.597 11:50:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:53.597 11:50:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:53.856 BaseBdev2_malloc 00:09:53.856 11:50:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:09:53.856 true 00:09:53.856 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:09:54.115 [2024-07-12 11:50:44.251605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:09:54.115 [2024-07-12 11:50:44.251633] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:09:54.115 [2024-07-12 11:50:44.251644] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12a1f40 00:09:54.115 [2024-07-12 11:50:44.251650] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:54.115 [2024-07-12 11:50:44.252714] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:54.115 [2024-07-12 11:50:44.252735] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:09:54.115 BaseBdev2 00:09:54.115 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:09:54.374 [2024-07-12 11:50:44.408026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:54.374 [2024-07-12 11:50:44.408888] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:54.374 [2024-07-12 11:50:44.409014] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12a2c80 00:09:54.374 [2024-07-12 11:50:44.409022] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:54.374 [2024-07-12 11:50:44.409148] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a5ac0 00:09:54.374 [2024-07-12 11:50:44.409245] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12a2c80 00:09:54.374 [2024-07-12 11:50:44.409251] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12a2c80 00:09:54.374 [2024-07-12 11:50:44.409317] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:54.374 "name": "raid_bdev1", 00:09:54.374 "uuid": "7c3225fd-3191-4593-97a8-61d88b2145a5", 00:09:54.374 "strip_size_kb": 64, 00:09:54.374 "state": "online", 00:09:54.374 "raid_level": "raid0", 00:09:54.374 "superblock": true, 00:09:54.374 "num_base_bdevs": 2, 00:09:54.374 "num_base_bdevs_discovered": 2, 00:09:54.374 "num_base_bdevs_operational": 2, 00:09:54.374 "base_bdevs_list": [ 00:09:54.374 { 00:09:54.374 "name": "BaseBdev1", 00:09:54.374 "uuid": "e5f76744-2bea-5eb3-9a03-2854422c645f", 00:09:54.374 "is_configured": true, 00:09:54.374 "data_offset": 2048, 00:09:54.374 "data_size": 63488 00:09:54.374 }, 00:09:54.374 { 00:09:54.374 "name": "BaseBdev2", 00:09:54.374 "uuid": 
"ae8a0173-0423-5d5f-8f53-045860ed6286", 00:09:54.374 "is_configured": true, 00:09:54.374 "data_offset": 2048, 00:09:54.374 "data_size": 63488 00:09:54.374 } 00:09:54.374 ] 00:09:54.374 }' 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:54.374 11:50:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:54.943 11:50:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:09:54.943 11:50:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:09:54.943 [2024-07-12 11:50:45.118060] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a4730 00:09:55.879 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:56.138 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:56.397 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:56.397 "name": "raid_bdev1", 00:09:56.397 "uuid": "7c3225fd-3191-4593-97a8-61d88b2145a5", 00:09:56.397 "strip_size_kb": 64, 00:09:56.397 "state": "online", 00:09:56.397 "raid_level": "raid0", 00:09:56.397 "superblock": true, 00:09:56.397 "num_base_bdevs": 2, 00:09:56.397 "num_base_bdevs_discovered": 2, 00:09:56.397 "num_base_bdevs_operational": 2, 00:09:56.397 "base_bdevs_list": [ 00:09:56.397 { 00:09:56.397 "name": "BaseBdev1", 00:09:56.397 "uuid": "e5f76744-2bea-5eb3-9a03-2854422c645f", 00:09:56.397 "is_configured": true, 00:09:56.397 "data_offset": 2048, 00:09:56.397 "data_size": 63488 00:09:56.397 }, 00:09:56.397 { 00:09:56.397 "name": "BaseBdev2", 00:09:56.397 "uuid": "ae8a0173-0423-5d5f-8f53-045860ed6286", 00:09:56.397 "is_configured": true, 00:09:56.397 "data_offset": 2048, 00:09:56.397 "data_size": 63488 00:09:56.397 } 00:09:56.397 ] 00:09:56.397 }' 00:09:56.397 11:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:56.397 11:50:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:56.656 11:50:46 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:56.914 [2024-07-12 11:50:47.009402] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:56.914 [2024-07-12 11:50:47.009441] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:56.914 [2024-07-12 11:50:47.011516] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:56.914 [2024-07-12 11:50:47.011538] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:56.914 [2024-07-12 11:50:47.011555] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:56.914 [2024-07-12 11:50:47.011560] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12a2c80 name raid_bdev1, state offline 00:09:56.914 0 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 594404 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 594404 ']' 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 594404 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 594404 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 594404' 00:09:56.914 
killing process with pid 594404 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 594404 00:09:56.914 [2024-07-12 11:50:47.072474] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:56.914 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 594404 00:09:56.914 [2024-07-12 11:50:47.081669] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.a7Pn0ZNdSj 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:09:57.172 00:09:57.172 real 0m4.857s 00:09:57.172 user 0m7.433s 00:09:57.172 sys 0m0.679s 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:57.172 11:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:57.172 ************************************ 00:09:57.172 END TEST raid_read_error_test 00:09:57.172 ************************************ 00:09:57.172 11:50:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:09:57.172 11:50:47 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:09:57.172 11:50:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
00:09:57.172 11:50:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:57.172 11:50:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:57.172 ************************************ 00:09:57.172 START TEST raid_write_error_test 00:09:57.172 ************************************ 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local 
strip_size 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Qeeq05EDXq 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=595279 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 595279 /var/tmp/spdk-raid.sock 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 595279 ']' 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:57.172 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:57.172 11:50:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:57.172 [2024-07-12 11:50:47.396529] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:09:57.172 [2024-07-12 11:50:47.396569] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid595279 ] 00:09:57.430 [2024-07-12 11:50:47.460059] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:57.430 [2024-07-12 11:50:47.530181] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:57.430 [2024-07-12 11:50:47.580612] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:57.430 [2024-07-12 11:50:47.580641] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:57.997 11:50:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:57.997 11:50:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:09:57.997 11:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:57.997 11:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:58.255 BaseBdev1_malloc 00:09:58.255 11:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:58.255 true 00:09:58.514 11:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:58.514 [2024-07-12 11:50:48.645113] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:58.514 [2024-07-12 11:50:48.645149] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:58.514 [2024-07-12 11:50:48.645159] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb02d0 00:09:58.514 [2024-07-12 11:50:48.645165] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:58.514 [2024-07-12 11:50:48.646284] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:58.514 [2024-07-12 11:50:48.646305] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:58.514 BaseBdev1 00:09:58.514 11:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:58.514 11:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:58.772 BaseBdev2_malloc 00:09:58.772 11:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:09:58.772 true 00:09:58.773 11:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:09:59.031 [2024-07-12 11:50:49.121729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:09:59.031 [2024-07-12 11:50:49.121757] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:59.031 [2024-07-12 11:50:49.121767] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x1bb4f40 00:09:59.031 [2024-07-12 11:50:49.121774] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:59.031 [2024-07-12 11:50:49.122822] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:59.031 [2024-07-12 11:50:49.122842] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:09:59.031 BaseBdev2 00:09:59.031 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:09:59.289 [2024-07-12 11:50:49.282170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:59.290 [2024-07-12 11:50:49.283105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:59.290 [2024-07-12 11:50:49.283241] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bb5c80 00:09:59.290 [2024-07-12 11:50:49.283250] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:59.290 [2024-07-12 11:50:49.283380] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb8ac0 00:09:59.290 [2024-07-12 11:50:49.283483] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bb5c80 00:09:59.290 [2024-07-12 11:50:49.283489] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bb5c80 00:09:59.290 [2024-07-12 11:50:49.283566] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:59.290 "name": "raid_bdev1", 00:09:59.290 "uuid": "d8c13c33-6e1d-495c-b5dc-fc7086cd3fbb", 00:09:59.290 "strip_size_kb": 64, 00:09:59.290 "state": "online", 00:09:59.290 "raid_level": "raid0", 00:09:59.290 "superblock": true, 00:09:59.290 "num_base_bdevs": 2, 00:09:59.290 "num_base_bdevs_discovered": 2, 00:09:59.290 "num_base_bdevs_operational": 2, 00:09:59.290 "base_bdevs_list": [ 00:09:59.290 { 00:09:59.290 "name": "BaseBdev1", 00:09:59.290 "uuid": "15af2538-da22-5af5-9c2c-45ff45efd1f7", 00:09:59.290 "is_configured": true, 00:09:59.290 "data_offset": 2048, 00:09:59.290 "data_size": 63488 00:09:59.290 }, 00:09:59.290 { 00:09:59.290 "name": "BaseBdev2", 00:09:59.290 "uuid": "31a3d576-a91e-5555-9693-df99a609c3e7", 00:09:59.290 "is_configured": true, 
00:09:59.290 "data_offset": 2048, 00:09:59.290 "data_size": 63488 00:09:59.290 } 00:09:59.290 ] 00:09:59.290 }' 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:59.290 11:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:59.856 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:09:59.856 11:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:09:59.856 [2024-07-12 11:50:50.040322] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb7730 00:10:00.792 11:50:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:01.051 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:01.309 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:01.309 "name": "raid_bdev1", 00:10:01.309 "uuid": "d8c13c33-6e1d-495c-b5dc-fc7086cd3fbb", 00:10:01.309 "strip_size_kb": 64, 00:10:01.309 "state": "online", 00:10:01.309 "raid_level": "raid0", 00:10:01.309 "superblock": true, 00:10:01.309 "num_base_bdevs": 2, 00:10:01.309 "num_base_bdevs_discovered": 2, 00:10:01.309 "num_base_bdevs_operational": 2, 00:10:01.309 "base_bdevs_list": [ 00:10:01.309 { 00:10:01.309 "name": "BaseBdev1", 00:10:01.309 "uuid": "15af2538-da22-5af5-9c2c-45ff45efd1f7", 00:10:01.309 "is_configured": true, 00:10:01.309 "data_offset": 2048, 00:10:01.309 "data_size": 63488 00:10:01.309 }, 00:10:01.309 { 00:10:01.309 "name": "BaseBdev2", 00:10:01.309 "uuid": "31a3d576-a91e-5555-9693-df99a609c3e7", 00:10:01.309 "is_configured": true, 00:10:01.309 "data_offset": 2048, 00:10:01.309 "data_size": 63488 00:10:01.309 } 00:10:01.309 ] 00:10:01.309 }' 00:10:01.309 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:01.309 11:50:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:01.567 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:01.825 [2024-07-12 11:50:51.956497] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:01.825 [2024-07-12 11:50:51.956528] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:01.825 [2024-07-12 11:50:51.958706] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:01.825 [2024-07-12 11:50:51.958728] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:01.825 [2024-07-12 11:50:51.958749] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:01.825 [2024-07-12 11:50:51.958755] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb5c80 name raid_bdev1, state offline 00:10:01.825 0 00:10:01.825 11:50:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 595279 00:10:01.825 11:50:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 595279 ']' 00:10:01.825 11:50:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 595279 00:10:01.825 11:50:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:10:01.825 11:50:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:01.825 11:50:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 595279 00:10:01.825 11:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:01.825 11:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:01.825 11:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 595279' 00:10:01.825 killing process with pid 595279 00:10:01.825 
11:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 595279 00:10:01.825 [2024-07-12 11:50:52.016230] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:01.825 11:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 595279 00:10:01.825 [2024-07-12 11:50:52.025616] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Qeeq05EDXq 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:10:02.084 00:10:02.084 real 0m4.880s 00:10:02.084 user 0m7.458s 00:10:02.084 sys 0m0.687s 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:02.084 11:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:02.084 ************************************ 00:10:02.084 END TEST raid_write_error_test 00:10:02.084 ************************************ 00:10:02.084 11:50:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:02.084 11:50:52 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:02.084 11:50:52 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:10:02.084 
11:50:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:02.084 11:50:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:02.084 11:50:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:02.084 ************************************ 00:10:02.084 START TEST raid_state_function_test 00:10:02.084 ************************************ 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- 
# local base_bdevs 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=596281 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 596281' 00:10:02.084 Process raid pid: 596281 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 596281 /var/tmp/spdk-raid.sock 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 596281 ']' 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:02.084 11:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:02.084 
11:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:02.085 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:02.085 11:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:02.085 11:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:02.085 [2024-07-12 11:50:52.312529] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:10:02.085 [2024-07-12 11:50:52.312562] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:02.343 [2024-07-12 11:50:52.371142] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:02.343 [2024-07-12 11:50:52.450950] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:02.343 [2024-07-12 11:50:52.503868] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:02.343 [2024-07-12 11:50:52.503891] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:02.911 11:50:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:02.911 11:50:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:02.911 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:03.168 [2024-07-12 11:50:53.250772] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:03.168 [2024-07-12 11:50:53.250802] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev1 doesn't exist now 00:10:03.168 [2024-07-12 11:50:53.250808] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:03.168 [2024-07-12 11:50:53.250813] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:03.168 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:03.168 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:03.168 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:03.168 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:03.169 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:03.169 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:03.169 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:03.169 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:03.169 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:03.169 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:03.169 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:03.169 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:03.428 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:03.428 "name": "Existed_Raid", 00:10:03.428 "uuid": "00000000-0000-0000-0000-000000000000", 
00:10:03.428 "strip_size_kb": 64, 00:10:03.428 "state": "configuring", 00:10:03.428 "raid_level": "concat", 00:10:03.428 "superblock": false, 00:10:03.428 "num_base_bdevs": 2, 00:10:03.428 "num_base_bdevs_discovered": 0, 00:10:03.428 "num_base_bdevs_operational": 2, 00:10:03.428 "base_bdevs_list": [ 00:10:03.428 { 00:10:03.428 "name": "BaseBdev1", 00:10:03.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:03.428 "is_configured": false, 00:10:03.428 "data_offset": 0, 00:10:03.428 "data_size": 0 00:10:03.428 }, 00:10:03.428 { 00:10:03.428 "name": "BaseBdev2", 00:10:03.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:03.428 "is_configured": false, 00:10:03.428 "data_offset": 0, 00:10:03.428 "data_size": 0 00:10:03.428 } 00:10:03.428 ] 00:10:03.428 }' 00:10:03.428 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:03.428 11:50:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:03.686 11:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:03.945 [2024-07-12 11:50:54.048749] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:03.945 [2024-07-12 11:50:54.048768] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23981b0 name Existed_Raid, state configuring 00:10:03.945 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:04.204 [2024-07-12 11:50:54.217188] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:04.204 [2024-07-12 11:50:54.217204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:04.204 [2024-07-12 
11:50:54.217208] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:04.204 [2024-07-12 11:50:54.217213] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:04.204 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:04.204 [2024-07-12 11:50:54.389698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:04.204 BaseBdev1 00:10:04.204 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:04.204 11:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:04.204 11:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:04.204 11:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:04.204 11:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:04.204 11:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:04.204 11:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:04.463 11:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:04.723 [ 00:10:04.723 { 00:10:04.723 "name": "BaseBdev1", 00:10:04.723 "aliases": [ 00:10:04.723 "3d3706b1-ba84-401a-a620-7a000488a58f" 00:10:04.723 ], 00:10:04.723 "product_name": "Malloc disk", 00:10:04.723 "block_size": 512, 00:10:04.723 "num_blocks": 65536, 00:10:04.723 "uuid": 
"3d3706b1-ba84-401a-a620-7a000488a58f", 00:10:04.723 "assigned_rate_limits": { 00:10:04.723 "rw_ios_per_sec": 0, 00:10:04.723 "rw_mbytes_per_sec": 0, 00:10:04.723 "r_mbytes_per_sec": 0, 00:10:04.723 "w_mbytes_per_sec": 0 00:10:04.723 }, 00:10:04.723 "claimed": true, 00:10:04.723 "claim_type": "exclusive_write", 00:10:04.723 "zoned": false, 00:10:04.723 "supported_io_types": { 00:10:04.723 "read": true, 00:10:04.723 "write": true, 00:10:04.723 "unmap": true, 00:10:04.723 "flush": true, 00:10:04.723 "reset": true, 00:10:04.723 "nvme_admin": false, 00:10:04.723 "nvme_io": false, 00:10:04.723 "nvme_io_md": false, 00:10:04.723 "write_zeroes": true, 00:10:04.723 "zcopy": true, 00:10:04.723 "get_zone_info": false, 00:10:04.723 "zone_management": false, 00:10:04.723 "zone_append": false, 00:10:04.723 "compare": false, 00:10:04.723 "compare_and_write": false, 00:10:04.723 "abort": true, 00:10:04.723 "seek_hole": false, 00:10:04.723 "seek_data": false, 00:10:04.723 "copy": true, 00:10:04.723 "nvme_iov_md": false 00:10:04.723 }, 00:10:04.723 "memory_domains": [ 00:10:04.723 { 00:10:04.723 "dma_device_id": "system", 00:10:04.723 "dma_device_type": 1 00:10:04.723 }, 00:10:04.723 { 00:10:04.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:04.723 "dma_device_type": 2 00:10:04.723 } 00:10:04.723 ], 00:10:04.723 "driver_specific": {} 00:10:04.723 } 00:10:04.723 ] 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:04.723 11:50:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:04.723 "name": "Existed_Raid", 00:10:04.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:04.723 "strip_size_kb": 64, 00:10:04.723 "state": "configuring", 00:10:04.723 "raid_level": "concat", 00:10:04.723 "superblock": false, 00:10:04.723 "num_base_bdevs": 2, 00:10:04.723 "num_base_bdevs_discovered": 1, 00:10:04.723 "num_base_bdevs_operational": 2, 00:10:04.723 "base_bdevs_list": [ 00:10:04.723 { 00:10:04.723 "name": "BaseBdev1", 00:10:04.723 "uuid": "3d3706b1-ba84-401a-a620-7a000488a58f", 00:10:04.723 "is_configured": true, 00:10:04.723 "data_offset": 0, 00:10:04.723 "data_size": 65536 00:10:04.723 }, 00:10:04.723 { 00:10:04.723 "name": "BaseBdev2", 00:10:04.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:04.723 "is_configured": false, 00:10:04.723 "data_offset": 0, 00:10:04.723 "data_size": 0 00:10:04.723 } 00:10:04.723 ] 00:10:04.723 }' 00:10:04.723 11:50:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:04.723 11:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:05.291 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:05.550 [2024-07-12 11:50:55.576778] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:05.550 [2024-07-12 11:50:55.576806] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2397aa0 name Existed_Raid, state configuring 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:05.550 [2024-07-12 11:50:55.741216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:05.550 [2024-07-12 11:50:55.742208] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:05.550 [2024-07-12 11:50:55.742230] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:05.550 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:05.809 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:05.810 "name": "Existed_Raid", 00:10:05.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:05.810 "strip_size_kb": 64, 00:10:05.810 "state": "configuring", 00:10:05.810 "raid_level": "concat", 00:10:05.810 "superblock": false, 00:10:05.810 "num_base_bdevs": 2, 00:10:05.810 "num_base_bdevs_discovered": 1, 00:10:05.810 "num_base_bdevs_operational": 2, 00:10:05.810 "base_bdevs_list": [ 00:10:05.810 { 00:10:05.810 "name": "BaseBdev1", 00:10:05.810 "uuid": "3d3706b1-ba84-401a-a620-7a000488a58f", 00:10:05.810 "is_configured": true, 00:10:05.810 "data_offset": 0, 00:10:05.810 "data_size": 65536 00:10:05.810 }, 00:10:05.810 { 00:10:05.810 "name": "BaseBdev2", 00:10:05.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:05.810 "is_configured": false, 00:10:05.810 "data_offset": 0, 00:10:05.810 "data_size": 0 00:10:05.810 } 00:10:05.810 ] 00:10:05.810 }' 
00:10:05.810 11:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:05.810 11:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:06.377 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:06.377 [2024-07-12 11:50:56.553987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:06.377 [2024-07-12 11:50:56.554014] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2398890 00:10:06.377 [2024-07-12 11:50:56.554017] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:06.378 [2024-07-12 11:50:56.554138] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2396c20 00:10:06.378 [2024-07-12 11:50:56.554215] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2398890 00:10:06.378 [2024-07-12 11:50:56.554220] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2398890 00:10:06.378 [2024-07-12 11:50:56.554353] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:06.378 BaseBdev2 00:10:06.378 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:06.378 11:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:06.378 11:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:06.378 11:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:06.378 11:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:06.378 11:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:10:06.378 11:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:06.637 11:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:06.896 [ 00:10:06.896 { 00:10:06.896 "name": "BaseBdev2", 00:10:06.896 "aliases": [ 00:10:06.896 "0fe1e081-5d10-4a3e-85c8-6bfef17c9685" 00:10:06.896 ], 00:10:06.896 "product_name": "Malloc disk", 00:10:06.896 "block_size": 512, 00:10:06.896 "num_blocks": 65536, 00:10:06.896 "uuid": "0fe1e081-5d10-4a3e-85c8-6bfef17c9685", 00:10:06.896 "assigned_rate_limits": { 00:10:06.896 "rw_ios_per_sec": 0, 00:10:06.896 "rw_mbytes_per_sec": 0, 00:10:06.896 "r_mbytes_per_sec": 0, 00:10:06.896 "w_mbytes_per_sec": 0 00:10:06.896 }, 00:10:06.896 "claimed": true, 00:10:06.896 "claim_type": "exclusive_write", 00:10:06.896 "zoned": false, 00:10:06.896 "supported_io_types": { 00:10:06.896 "read": true, 00:10:06.896 "write": true, 00:10:06.896 "unmap": true, 00:10:06.896 "flush": true, 00:10:06.896 "reset": true, 00:10:06.896 "nvme_admin": false, 00:10:06.896 "nvme_io": false, 00:10:06.896 "nvme_io_md": false, 00:10:06.896 "write_zeroes": true, 00:10:06.896 "zcopy": true, 00:10:06.896 "get_zone_info": false, 00:10:06.896 "zone_management": false, 00:10:06.896 "zone_append": false, 00:10:06.896 "compare": false, 00:10:06.896 "compare_and_write": false, 00:10:06.896 "abort": true, 00:10:06.896 "seek_hole": false, 00:10:06.896 "seek_data": false, 00:10:06.896 "copy": true, 00:10:06.896 "nvme_iov_md": false 00:10:06.896 }, 00:10:06.896 "memory_domains": [ 00:10:06.896 { 00:10:06.896 "dma_device_id": "system", 00:10:06.896 "dma_device_type": 1 00:10:06.896 }, 00:10:06.896 { 00:10:06.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:06.896 "dma_device_type": 2 
00:10:06.896 } 00:10:06.896 ], 00:10:06.896 "driver_specific": {} 00:10:06.896 } 00:10:06.896 ] 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:06.896 11:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:06.896 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:06.896 "name": "Existed_Raid", 00:10:06.896 "uuid": "bf11574c-86ca-4819-819e-3a060439ae18", 00:10:06.896 "strip_size_kb": 64, 00:10:06.896 "state": "online", 00:10:06.896 "raid_level": "concat", 00:10:06.896 "superblock": false, 00:10:06.896 "num_base_bdevs": 2, 00:10:06.896 "num_base_bdevs_discovered": 2, 00:10:06.896 "num_base_bdevs_operational": 2, 00:10:06.896 "base_bdevs_list": [ 00:10:06.896 { 00:10:06.896 "name": "BaseBdev1", 00:10:06.896 "uuid": "3d3706b1-ba84-401a-a620-7a000488a58f", 00:10:06.896 "is_configured": true, 00:10:06.896 "data_offset": 0, 00:10:06.896 "data_size": 65536 00:10:06.896 }, 00:10:06.896 { 00:10:06.896 "name": "BaseBdev2", 00:10:06.896 "uuid": "0fe1e081-5d10-4a3e-85c8-6bfef17c9685", 00:10:06.896 "is_configured": true, 00:10:06.896 "data_offset": 0, 00:10:06.896 "data_size": 65536 00:10:06.897 } 00:10:06.897 ] 00:10:06.897 }' 00:10:06.897 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:06.897 11:50:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:07.465 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:07.465 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:07.465 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:07.465 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:07.465 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:07.465 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:07.465 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:07.465 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:07.724 [2024-07-12 11:50:57.721201] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:07.724 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:07.724 "name": "Existed_Raid", 00:10:07.724 "aliases": [ 00:10:07.724 "bf11574c-86ca-4819-819e-3a060439ae18" 00:10:07.724 ], 00:10:07.724 "product_name": "Raid Volume", 00:10:07.724 "block_size": 512, 00:10:07.724 "num_blocks": 131072, 00:10:07.724 "uuid": "bf11574c-86ca-4819-819e-3a060439ae18", 00:10:07.724 "assigned_rate_limits": { 00:10:07.724 "rw_ios_per_sec": 0, 00:10:07.724 "rw_mbytes_per_sec": 0, 00:10:07.724 "r_mbytes_per_sec": 0, 00:10:07.724 "w_mbytes_per_sec": 0 00:10:07.724 }, 00:10:07.724 "claimed": false, 00:10:07.724 "zoned": false, 00:10:07.724 "supported_io_types": { 00:10:07.724 "read": true, 00:10:07.724 "write": true, 00:10:07.724 "unmap": true, 00:10:07.724 "flush": true, 00:10:07.724 "reset": true, 00:10:07.724 "nvme_admin": false, 00:10:07.724 "nvme_io": false, 00:10:07.724 "nvme_io_md": false, 00:10:07.724 "write_zeroes": true, 00:10:07.724 "zcopy": false, 00:10:07.724 "get_zone_info": false, 00:10:07.724 "zone_management": false, 00:10:07.724 "zone_append": false, 00:10:07.724 "compare": false, 00:10:07.724 "compare_and_write": false, 00:10:07.724 "abort": false, 00:10:07.724 "seek_hole": false, 00:10:07.724 "seek_data": false, 00:10:07.724 "copy": false, 00:10:07.724 "nvme_iov_md": false 00:10:07.724 }, 00:10:07.724 "memory_domains": [ 00:10:07.724 { 00:10:07.724 "dma_device_id": "system", 00:10:07.724 "dma_device_type": 1 00:10:07.724 }, 00:10:07.724 { 00:10:07.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:07.724 "dma_device_type": 2 00:10:07.724 }, 00:10:07.724 { 00:10:07.724 "dma_device_id": "system", 00:10:07.724 "dma_device_type": 1 00:10:07.724 }, 00:10:07.724 { 00:10:07.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:07.724 
"dma_device_type": 2 00:10:07.724 } 00:10:07.724 ], 00:10:07.724 "driver_specific": { 00:10:07.724 "raid": { 00:10:07.724 "uuid": "bf11574c-86ca-4819-819e-3a060439ae18", 00:10:07.724 "strip_size_kb": 64, 00:10:07.724 "state": "online", 00:10:07.724 "raid_level": "concat", 00:10:07.724 "superblock": false, 00:10:07.724 "num_base_bdevs": 2, 00:10:07.724 "num_base_bdevs_discovered": 2, 00:10:07.724 "num_base_bdevs_operational": 2, 00:10:07.724 "base_bdevs_list": [ 00:10:07.724 { 00:10:07.724 "name": "BaseBdev1", 00:10:07.724 "uuid": "3d3706b1-ba84-401a-a620-7a000488a58f", 00:10:07.724 "is_configured": true, 00:10:07.724 "data_offset": 0, 00:10:07.724 "data_size": 65536 00:10:07.724 }, 00:10:07.724 { 00:10:07.724 "name": "BaseBdev2", 00:10:07.724 "uuid": "0fe1e081-5d10-4a3e-85c8-6bfef17c9685", 00:10:07.724 "is_configured": true, 00:10:07.724 "data_offset": 0, 00:10:07.724 "data_size": 65536 00:10:07.724 } 00:10:07.724 ] 00:10:07.724 } 00:10:07.724 } 00:10:07.724 }' 00:10:07.724 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:07.724 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:07.724 BaseBdev2' 00:10:07.724 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:07.724 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:07.724 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:07.724 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:07.724 "name": "BaseBdev1", 00:10:07.724 "aliases": [ 00:10:07.724 "3d3706b1-ba84-401a-a620-7a000488a58f" 00:10:07.724 ], 00:10:07.724 "product_name": "Malloc disk", 00:10:07.724 
"block_size": 512, 00:10:07.724 "num_blocks": 65536, 00:10:07.724 "uuid": "3d3706b1-ba84-401a-a620-7a000488a58f", 00:10:07.724 "assigned_rate_limits": { 00:10:07.724 "rw_ios_per_sec": 0, 00:10:07.724 "rw_mbytes_per_sec": 0, 00:10:07.724 "r_mbytes_per_sec": 0, 00:10:07.725 "w_mbytes_per_sec": 0 00:10:07.725 }, 00:10:07.725 "claimed": true, 00:10:07.725 "claim_type": "exclusive_write", 00:10:07.725 "zoned": false, 00:10:07.725 "supported_io_types": { 00:10:07.725 "read": true, 00:10:07.725 "write": true, 00:10:07.725 "unmap": true, 00:10:07.725 "flush": true, 00:10:07.725 "reset": true, 00:10:07.725 "nvme_admin": false, 00:10:07.725 "nvme_io": false, 00:10:07.725 "nvme_io_md": false, 00:10:07.725 "write_zeroes": true, 00:10:07.725 "zcopy": true, 00:10:07.725 "get_zone_info": false, 00:10:07.725 "zone_management": false, 00:10:07.725 "zone_append": false, 00:10:07.725 "compare": false, 00:10:07.725 "compare_and_write": false, 00:10:07.725 "abort": true, 00:10:07.725 "seek_hole": false, 00:10:07.725 "seek_data": false, 00:10:07.725 "copy": true, 00:10:07.725 "nvme_iov_md": false 00:10:07.725 }, 00:10:07.725 "memory_domains": [ 00:10:07.725 { 00:10:07.725 "dma_device_id": "system", 00:10:07.725 "dma_device_type": 1 00:10:07.725 }, 00:10:07.725 { 00:10:07.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:07.725 "dma_device_type": 2 00:10:07.725 } 00:10:07.725 ], 00:10:07.725 "driver_specific": {} 00:10:07.725 }' 00:10:07.725 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:07.984 11:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:07.984 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:07.984 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:07.984 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:07.984 11:50:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:07.984 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:07.984 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:07.984 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:07.984 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:07.984 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:08.243 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:08.243 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:08.243 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:08.243 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:08.243 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:08.243 "name": "BaseBdev2", 00:10:08.243 "aliases": [ 00:10:08.243 "0fe1e081-5d10-4a3e-85c8-6bfef17c9685" 00:10:08.243 ], 00:10:08.243 "product_name": "Malloc disk", 00:10:08.243 "block_size": 512, 00:10:08.243 "num_blocks": 65536, 00:10:08.243 "uuid": "0fe1e081-5d10-4a3e-85c8-6bfef17c9685", 00:10:08.243 "assigned_rate_limits": { 00:10:08.243 "rw_ios_per_sec": 0, 00:10:08.243 "rw_mbytes_per_sec": 0, 00:10:08.243 "r_mbytes_per_sec": 0, 00:10:08.243 "w_mbytes_per_sec": 0 00:10:08.243 }, 00:10:08.243 "claimed": true, 00:10:08.243 "claim_type": "exclusive_write", 00:10:08.243 "zoned": false, 00:10:08.243 "supported_io_types": { 00:10:08.243 "read": true, 00:10:08.243 "write": true, 00:10:08.243 "unmap": true, 00:10:08.243 "flush": true, 00:10:08.243 "reset": true, 00:10:08.243 "nvme_admin": 
false, 00:10:08.243 "nvme_io": false, 00:10:08.243 "nvme_io_md": false, 00:10:08.243 "write_zeroes": true, 00:10:08.243 "zcopy": true, 00:10:08.243 "get_zone_info": false, 00:10:08.243 "zone_management": false, 00:10:08.243 "zone_append": false, 00:10:08.243 "compare": false, 00:10:08.243 "compare_and_write": false, 00:10:08.243 "abort": true, 00:10:08.243 "seek_hole": false, 00:10:08.243 "seek_data": false, 00:10:08.243 "copy": true, 00:10:08.243 "nvme_iov_md": false 00:10:08.243 }, 00:10:08.243 "memory_domains": [ 00:10:08.243 { 00:10:08.243 "dma_device_id": "system", 00:10:08.243 "dma_device_type": 1 00:10:08.243 }, 00:10:08.243 { 00:10:08.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:08.243 "dma_device_type": 2 00:10:08.243 } 00:10:08.243 ], 00:10:08.243 "driver_specific": {} 00:10:08.243 }' 00:10:08.243 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:08.243 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:08.503 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:08.762 [2024-07-12 11:50:58.896089] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:08.762 [2024-07-12 11:50:58.896107] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:08.762 [2024-07-12 11:50:58.896135] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:08.762 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:08.763 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:08.763 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:08.763 11:50:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:08.763 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:08.763 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:08.763 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:08.763 11:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:09.022 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:09.022 "name": "Existed_Raid", 00:10:09.022 "uuid": "bf11574c-86ca-4819-819e-3a060439ae18", 00:10:09.022 "strip_size_kb": 64, 00:10:09.022 "state": "offline", 00:10:09.022 "raid_level": "concat", 00:10:09.022 "superblock": false, 00:10:09.022 "num_base_bdevs": 2, 00:10:09.022 "num_base_bdevs_discovered": 1, 00:10:09.022 "num_base_bdevs_operational": 1, 00:10:09.022 "base_bdevs_list": [ 00:10:09.022 { 00:10:09.022 "name": null, 00:10:09.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:09.022 "is_configured": false, 00:10:09.022 "data_offset": 0, 00:10:09.022 "data_size": 65536 00:10:09.022 }, 00:10:09.022 { 00:10:09.022 "name": "BaseBdev2", 00:10:09.022 "uuid": "0fe1e081-5d10-4a3e-85c8-6bfef17c9685", 00:10:09.022 "is_configured": true, 00:10:09.022 "data_offset": 0, 00:10:09.022 "data_size": 65536 00:10:09.022 } 00:10:09.022 ] 00:10:09.022 }' 00:10:09.022 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:09.022 11:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:09.590 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:09.590 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # 
(( i < num_base_bdevs )) 00:10:09.590 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:09.590 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:09.590 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:09.590 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:09.590 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:09.849 [2024-07-12 11:50:59.883535] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:09.849 [2024-07-12 11:50:59.883572] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2398890 name Existed_Raid, state offline 00:10:09.849 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:09.849 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:09.849 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:09.849 11:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:09.849 11:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:09.849 11:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:09.849 11:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:09.849 11:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 596281 00:10:09.849 11:51:00 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 596281 ']' 00:10:09.849 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 596281 00:10:09.849 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 596281 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 596281' 00:10:10.108 killing process with pid 596281 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 596281 00:10:10.108 [2024-07-12 11:51:00.134235] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 596281 00:10:10.108 [2024-07-12 11:51:00.135036] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:10.108 00:10:10.108 real 0m8.041s 00:10:10.108 user 0m14.424s 00:10:10.108 sys 0m1.261s 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:10.108 11:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:10.108 ************************************ 00:10:10.108 END TEST raid_state_function_test 00:10:10.108 ************************************ 00:10:10.108 11:51:00 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:10:10.108 11:51:00 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:10:10.108 11:51:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:10.108 11:51:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:10.108 11:51:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:10.368 ************************************ 00:10:10.368 START TEST raid_state_function_test_sb 00:10:10.368 ************************************ 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- 
# (( i <= num_base_bdevs )) 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=597901 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 597901' 00:10:10.368 Process raid pid: 597901 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 597901 /var/tmp/spdk-raid.sock 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 
-- # '[' -z 597901 ']' 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:10.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:10.368 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:10.368 [2024-07-12 11:51:00.436413] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:10:10.368 [2024-07-12 11:51:00.436452] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:10.368 [2024-07-12 11:51:00.502236] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:10.368 [2024-07-12 11:51:00.580909] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:10.627 [2024-07-12 11:51:00.632737] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:10.627 [2024-07-12 11:51:00.632762] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:11.195 [2024-07-12 11:51:01.379998] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:11.195 [2024-07-12 11:51:01.380030] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:11.195 [2024-07-12 11:51:01.380036] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:11.195 [2024-07-12 11:51:01.380041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:10:11.195 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:11.455 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:11.455 "name": "Existed_Raid", 00:10:11.455 "uuid": "6aceff25-8b7b-4d00-ad19-d4c817a3a5fd", 00:10:11.455 "strip_size_kb": 64, 00:10:11.455 "state": "configuring", 00:10:11.455 "raid_level": "concat", 00:10:11.455 "superblock": true, 00:10:11.455 "num_base_bdevs": 2, 00:10:11.455 "num_base_bdevs_discovered": 0, 00:10:11.455 "num_base_bdevs_operational": 2, 00:10:11.455 "base_bdevs_list": [ 00:10:11.455 { 00:10:11.455 "name": "BaseBdev1", 00:10:11.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:11.455 "is_configured": false, 00:10:11.455 "data_offset": 0, 00:10:11.455 "data_size": 0 00:10:11.455 }, 00:10:11.455 { 00:10:11.455 "name": "BaseBdev2", 00:10:11.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:11.455 "is_configured": false, 00:10:11.455 "data_offset": 0, 00:10:11.455 "data_size": 0 00:10:11.455 } 00:10:11.455 ] 00:10:11.455 }' 00:10:11.455 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:11.455 11:51:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:12.023 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:12.023 [2024-07-12 11:51:02.214056] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:12.023 [2024-07-12 11:51:02.214080] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f491b0 name Existed_Raid, state configuring 00:10:12.023 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:12.281 [2024-07-12 11:51:02.382523] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:12.282 [2024-07-12 11:51:02.382540] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:12.282 [2024-07-12 11:51:02.382545] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:12.282 [2024-07-12 11:51:02.382550] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:12.282 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:12.541 [2024-07-12 11:51:02.559313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:12.541 BaseBdev1 00:10:12.541 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:12.541 11:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:12.541 11:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:12.541 11:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:12.541 11:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:12.541 11:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:12.541 11:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:12.541 11:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:12.799 [ 00:10:12.799 { 00:10:12.799 "name": "BaseBdev1", 00:10:12.799 "aliases": [ 00:10:12.799 "590737e3-f467-4f57-86da-505578df31b5" 00:10:12.799 ], 00:10:12.799 "product_name": "Malloc disk", 00:10:12.799 "block_size": 512, 00:10:12.799 "num_blocks": 65536, 00:10:12.799 "uuid": "590737e3-f467-4f57-86da-505578df31b5", 00:10:12.799 "assigned_rate_limits": { 00:10:12.799 "rw_ios_per_sec": 0, 00:10:12.799 "rw_mbytes_per_sec": 0, 00:10:12.799 "r_mbytes_per_sec": 0, 00:10:12.799 "w_mbytes_per_sec": 0 00:10:12.799 }, 00:10:12.799 "claimed": true, 00:10:12.799 "claim_type": "exclusive_write", 00:10:12.799 "zoned": false, 00:10:12.799 "supported_io_types": { 00:10:12.799 "read": true, 00:10:12.799 "write": true, 00:10:12.799 "unmap": true, 00:10:12.799 "flush": true, 00:10:12.799 "reset": true, 00:10:12.799 "nvme_admin": false, 00:10:12.799 "nvme_io": false, 00:10:12.799 "nvme_io_md": false, 00:10:12.799 "write_zeroes": true, 00:10:12.799 "zcopy": true, 00:10:12.799 "get_zone_info": false, 00:10:12.799 "zone_management": false, 00:10:12.799 "zone_append": false, 00:10:12.799 "compare": false, 00:10:12.799 "compare_and_write": false, 00:10:12.799 "abort": true, 00:10:12.799 "seek_hole": false, 00:10:12.799 "seek_data": false, 00:10:12.799 "copy": true, 00:10:12.799 "nvme_iov_md": false 00:10:12.799 }, 00:10:12.799 "memory_domains": [ 00:10:12.799 { 00:10:12.799 "dma_device_id": "system", 00:10:12.799 "dma_device_type": 1 00:10:12.799 }, 00:10:12.799 { 00:10:12.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:12.799 "dma_device_type": 2 00:10:12.799 } 00:10:12.799 ], 00:10:12.799 "driver_specific": {} 00:10:12.799 } 00:10:12.799 ] 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:12.799 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:13.058 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:13.058 "name": "Existed_Raid", 00:10:13.058 "uuid": "0b296a18-a883-40ca-ba72-1bfb6dcf4140", 00:10:13.058 "strip_size_kb": 64, 00:10:13.058 "state": "configuring", 00:10:13.058 "raid_level": "concat", 00:10:13.058 "superblock": true, 00:10:13.058 "num_base_bdevs": 2, 00:10:13.058 "num_base_bdevs_discovered": 1, 00:10:13.058 "num_base_bdevs_operational": 2, 00:10:13.058 "base_bdevs_list": [ 00:10:13.058 { 00:10:13.058 "name": "BaseBdev1", 
00:10:13.058 "uuid": "590737e3-f467-4f57-86da-505578df31b5", 00:10:13.058 "is_configured": true, 00:10:13.058 "data_offset": 2048, 00:10:13.058 "data_size": 63488 00:10:13.058 }, 00:10:13.058 { 00:10:13.058 "name": "BaseBdev2", 00:10:13.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:13.058 "is_configured": false, 00:10:13.058 "data_offset": 0, 00:10:13.058 "data_size": 0 00:10:13.058 } 00:10:13.058 ] 00:10:13.058 }' 00:10:13.058 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:13.058 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:13.625 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:13.625 [2024-07-12 11:51:03.738353] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:13.625 [2024-07-12 11:51:03.738384] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f48aa0 name Existed_Raid, state configuring 00:10:13.626 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:13.884 [2024-07-12 11:51:03.906821] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:13.884 [2024-07-12 11:51:03.907852] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:13.884 [2024-07-12 11:51:03.907874] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:13.884 
11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:13.884 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:13.884 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:13.884 "name": "Existed_Raid", 00:10:13.884 "uuid": "9268a750-1b37-4493-8dc0-748b1144f76c", 00:10:13.884 "strip_size_kb": 64, 00:10:13.884 "state": "configuring", 00:10:13.884 "raid_level": "concat", 00:10:13.884 "superblock": true, 00:10:13.884 "num_base_bdevs": 2, 00:10:13.884 "num_base_bdevs_discovered": 1, 00:10:13.884 "num_base_bdevs_operational": 2, 
00:10:13.884 "base_bdevs_list": [ 00:10:13.884 { 00:10:13.884 "name": "BaseBdev1", 00:10:13.884 "uuid": "590737e3-f467-4f57-86da-505578df31b5", 00:10:13.884 "is_configured": true, 00:10:13.884 "data_offset": 2048, 00:10:13.884 "data_size": 63488 00:10:13.884 }, 00:10:13.884 { 00:10:13.884 "name": "BaseBdev2", 00:10:13.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:13.884 "is_configured": false, 00:10:13.884 "data_offset": 0, 00:10:13.884 "data_size": 0 00:10:13.885 } 00:10:13.885 ] 00:10:13.885 }' 00:10:13.885 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:13.885 11:51:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:14.451 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:14.710 [2024-07-12 11:51:04.767698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:14.710 [2024-07-12 11:51:04.767810] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f49890 00:10:14.710 [2024-07-12 11:51:04.767822] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:14.710 [2024-07-12 11:51:04.767943] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f47c20 00:10:14.710 [2024-07-12 11:51:04.768026] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f49890 00:10:14.710 [2024-07-12 11:51:04.768031] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f49890 00:10:14.710 [2024-07-12 11:51:04.768093] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:14.710 BaseBdev2 00:10:14.710 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:14.710 11:51:04 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:14.710 11:51:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:14.710 11:51:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:14.710 11:51:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:14.710 11:51:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:14.710 11:51:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:14.710 11:51:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:14.968 [ 00:10:14.968 { 00:10:14.968 "name": "BaseBdev2", 00:10:14.968 "aliases": [ 00:10:14.968 "dbaefc2b-1bab-4a88-a62e-830772cfdc60" 00:10:14.968 ], 00:10:14.968 "product_name": "Malloc disk", 00:10:14.968 "block_size": 512, 00:10:14.968 "num_blocks": 65536, 00:10:14.968 "uuid": "dbaefc2b-1bab-4a88-a62e-830772cfdc60", 00:10:14.968 "assigned_rate_limits": { 00:10:14.968 "rw_ios_per_sec": 0, 00:10:14.968 "rw_mbytes_per_sec": 0, 00:10:14.968 "r_mbytes_per_sec": 0, 00:10:14.968 "w_mbytes_per_sec": 0 00:10:14.968 }, 00:10:14.968 "claimed": true, 00:10:14.968 "claim_type": "exclusive_write", 00:10:14.968 "zoned": false, 00:10:14.968 "supported_io_types": { 00:10:14.968 "read": true, 00:10:14.968 "write": true, 00:10:14.968 "unmap": true, 00:10:14.968 "flush": true, 00:10:14.968 "reset": true, 00:10:14.968 "nvme_admin": false, 00:10:14.968 "nvme_io": false, 00:10:14.968 "nvme_io_md": false, 00:10:14.968 "write_zeroes": true, 00:10:14.968 "zcopy": true, 00:10:14.968 "get_zone_info": false, 
00:10:14.968 "zone_management": false, 00:10:14.968 "zone_append": false, 00:10:14.968 "compare": false, 00:10:14.968 "compare_and_write": false, 00:10:14.968 "abort": true, 00:10:14.968 "seek_hole": false, 00:10:14.968 "seek_data": false, 00:10:14.968 "copy": true, 00:10:14.968 "nvme_iov_md": false 00:10:14.968 }, 00:10:14.968 "memory_domains": [ 00:10:14.968 { 00:10:14.968 "dma_device_id": "system", 00:10:14.968 "dma_device_type": 1 00:10:14.968 }, 00:10:14.968 { 00:10:14.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:14.968 "dma_device_type": 2 00:10:14.968 } 00:10:14.968 ], 00:10:14.968 "driver_specific": {} 00:10:14.968 } 00:10:14.968 ] 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:14.968 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:14.969 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:15.227 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:15.227 "name": "Existed_Raid", 00:10:15.227 "uuid": "9268a750-1b37-4493-8dc0-748b1144f76c", 00:10:15.227 "strip_size_kb": 64, 00:10:15.227 "state": "online", 00:10:15.227 "raid_level": "concat", 00:10:15.227 "superblock": true, 00:10:15.227 "num_base_bdevs": 2, 00:10:15.227 "num_base_bdevs_discovered": 2, 00:10:15.227 "num_base_bdevs_operational": 2, 00:10:15.227 "base_bdevs_list": [ 00:10:15.227 { 00:10:15.227 "name": "BaseBdev1", 00:10:15.227 "uuid": "590737e3-f467-4f57-86da-505578df31b5", 00:10:15.227 "is_configured": true, 00:10:15.227 "data_offset": 2048, 00:10:15.227 "data_size": 63488 00:10:15.227 }, 00:10:15.227 { 00:10:15.227 "name": "BaseBdev2", 00:10:15.227 "uuid": "dbaefc2b-1bab-4a88-a62e-830772cfdc60", 00:10:15.227 "is_configured": true, 00:10:15.227 "data_offset": 2048, 00:10:15.227 "data_size": 63488 00:10:15.227 } 00:10:15.227 ] 00:10:15.227 }' 00:10:15.227 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:15.227 11:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:15.794 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:15.794 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:15.794 11:51:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:15.794 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:15.794 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:15.794 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:15.794 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:15.794 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:15.794 [2024-07-12 11:51:05.962975] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:15.794 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:15.794 "name": "Existed_Raid", 00:10:15.794 "aliases": [ 00:10:15.794 "9268a750-1b37-4493-8dc0-748b1144f76c" 00:10:15.794 ], 00:10:15.794 "product_name": "Raid Volume", 00:10:15.794 "block_size": 512, 00:10:15.794 "num_blocks": 126976, 00:10:15.794 "uuid": "9268a750-1b37-4493-8dc0-748b1144f76c", 00:10:15.794 "assigned_rate_limits": { 00:10:15.794 "rw_ios_per_sec": 0, 00:10:15.794 "rw_mbytes_per_sec": 0, 00:10:15.794 "r_mbytes_per_sec": 0, 00:10:15.794 "w_mbytes_per_sec": 0 00:10:15.794 }, 00:10:15.794 "claimed": false, 00:10:15.794 "zoned": false, 00:10:15.794 "supported_io_types": { 00:10:15.794 "read": true, 00:10:15.794 "write": true, 00:10:15.794 "unmap": true, 00:10:15.794 "flush": true, 00:10:15.794 "reset": true, 00:10:15.794 "nvme_admin": false, 00:10:15.794 "nvme_io": false, 00:10:15.794 "nvme_io_md": false, 00:10:15.794 "write_zeroes": true, 00:10:15.794 "zcopy": false, 00:10:15.794 "get_zone_info": false, 00:10:15.794 "zone_management": false, 00:10:15.794 "zone_append": false, 00:10:15.794 "compare": false, 
00:10:15.794 "compare_and_write": false, 00:10:15.794 "abort": false, 00:10:15.794 "seek_hole": false, 00:10:15.794 "seek_data": false, 00:10:15.794 "copy": false, 00:10:15.794 "nvme_iov_md": false 00:10:15.794 }, 00:10:15.794 "memory_domains": [ 00:10:15.794 { 00:10:15.794 "dma_device_id": "system", 00:10:15.794 "dma_device_type": 1 00:10:15.794 }, 00:10:15.794 { 00:10:15.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:15.795 "dma_device_type": 2 00:10:15.795 }, 00:10:15.795 { 00:10:15.795 "dma_device_id": "system", 00:10:15.795 "dma_device_type": 1 00:10:15.795 }, 00:10:15.795 { 00:10:15.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:15.795 "dma_device_type": 2 00:10:15.795 } 00:10:15.795 ], 00:10:15.795 "driver_specific": { 00:10:15.795 "raid": { 00:10:15.795 "uuid": "9268a750-1b37-4493-8dc0-748b1144f76c", 00:10:15.795 "strip_size_kb": 64, 00:10:15.795 "state": "online", 00:10:15.795 "raid_level": "concat", 00:10:15.795 "superblock": true, 00:10:15.795 "num_base_bdevs": 2, 00:10:15.795 "num_base_bdevs_discovered": 2, 00:10:15.795 "num_base_bdevs_operational": 2, 00:10:15.795 "base_bdevs_list": [ 00:10:15.795 { 00:10:15.795 "name": "BaseBdev1", 00:10:15.795 "uuid": "590737e3-f467-4f57-86da-505578df31b5", 00:10:15.795 "is_configured": true, 00:10:15.795 "data_offset": 2048, 00:10:15.795 "data_size": 63488 00:10:15.795 }, 00:10:15.795 { 00:10:15.795 "name": "BaseBdev2", 00:10:15.795 "uuid": "dbaefc2b-1bab-4a88-a62e-830772cfdc60", 00:10:15.795 "is_configured": true, 00:10:15.795 "data_offset": 2048, 00:10:15.795 "data_size": 63488 00:10:15.795 } 00:10:15.795 ] 00:10:15.795 } 00:10:15.795 } 00:10:15.795 }' 00:10:15.795 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:15.795 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:15.795 BaseBdev2' 00:10:15.795 11:51:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:15.795 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:15.795 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:16.053 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:16.053 "name": "BaseBdev1", 00:10:16.053 "aliases": [ 00:10:16.053 "590737e3-f467-4f57-86da-505578df31b5" 00:10:16.053 ], 00:10:16.053 "product_name": "Malloc disk", 00:10:16.053 "block_size": 512, 00:10:16.053 "num_blocks": 65536, 00:10:16.053 "uuid": "590737e3-f467-4f57-86da-505578df31b5", 00:10:16.053 "assigned_rate_limits": { 00:10:16.053 "rw_ios_per_sec": 0, 00:10:16.053 "rw_mbytes_per_sec": 0, 00:10:16.053 "r_mbytes_per_sec": 0, 00:10:16.053 "w_mbytes_per_sec": 0 00:10:16.053 }, 00:10:16.054 "claimed": true, 00:10:16.054 "claim_type": "exclusive_write", 00:10:16.054 "zoned": false, 00:10:16.054 "supported_io_types": { 00:10:16.054 "read": true, 00:10:16.054 "write": true, 00:10:16.054 "unmap": true, 00:10:16.054 "flush": true, 00:10:16.054 "reset": true, 00:10:16.054 "nvme_admin": false, 00:10:16.054 "nvme_io": false, 00:10:16.054 "nvme_io_md": false, 00:10:16.054 "write_zeroes": true, 00:10:16.054 "zcopy": true, 00:10:16.054 "get_zone_info": false, 00:10:16.054 "zone_management": false, 00:10:16.054 "zone_append": false, 00:10:16.054 "compare": false, 00:10:16.054 "compare_and_write": false, 00:10:16.054 "abort": true, 00:10:16.054 "seek_hole": false, 00:10:16.054 "seek_data": false, 00:10:16.054 "copy": true, 00:10:16.054 "nvme_iov_md": false 00:10:16.054 }, 00:10:16.054 "memory_domains": [ 00:10:16.054 { 00:10:16.054 "dma_device_id": "system", 00:10:16.054 "dma_device_type": 1 00:10:16.054 }, 00:10:16.054 { 00:10:16.054 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:16.054 "dma_device_type": 2 00:10:16.054 } 00:10:16.054 ], 00:10:16.054 "driver_specific": {} 00:10:16.054 }' 00:10:16.054 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:16.054 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:16.054 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:16.054 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:16.313 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:16.572 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:16.572 "name": "BaseBdev2", 00:10:16.572 
"aliases": [ 00:10:16.572 "dbaefc2b-1bab-4a88-a62e-830772cfdc60" 00:10:16.572 ], 00:10:16.572 "product_name": "Malloc disk", 00:10:16.572 "block_size": 512, 00:10:16.572 "num_blocks": 65536, 00:10:16.572 "uuid": "dbaefc2b-1bab-4a88-a62e-830772cfdc60", 00:10:16.572 "assigned_rate_limits": { 00:10:16.572 "rw_ios_per_sec": 0, 00:10:16.572 "rw_mbytes_per_sec": 0, 00:10:16.572 "r_mbytes_per_sec": 0, 00:10:16.572 "w_mbytes_per_sec": 0 00:10:16.572 }, 00:10:16.572 "claimed": true, 00:10:16.572 "claim_type": "exclusive_write", 00:10:16.572 "zoned": false, 00:10:16.572 "supported_io_types": { 00:10:16.572 "read": true, 00:10:16.572 "write": true, 00:10:16.572 "unmap": true, 00:10:16.572 "flush": true, 00:10:16.572 "reset": true, 00:10:16.572 "nvme_admin": false, 00:10:16.572 "nvme_io": false, 00:10:16.572 "nvme_io_md": false, 00:10:16.572 "write_zeroes": true, 00:10:16.572 "zcopy": true, 00:10:16.572 "get_zone_info": false, 00:10:16.572 "zone_management": false, 00:10:16.572 "zone_append": false, 00:10:16.572 "compare": false, 00:10:16.572 "compare_and_write": false, 00:10:16.572 "abort": true, 00:10:16.572 "seek_hole": false, 00:10:16.572 "seek_data": false, 00:10:16.572 "copy": true, 00:10:16.572 "nvme_iov_md": false 00:10:16.572 }, 00:10:16.572 "memory_domains": [ 00:10:16.572 { 00:10:16.572 "dma_device_id": "system", 00:10:16.572 "dma_device_type": 1 00:10:16.572 }, 00:10:16.572 { 00:10:16.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:16.572 "dma_device_type": 2 00:10:16.572 } 00:10:16.572 ], 00:10:16.572 "driver_specific": {} 00:10:16.572 }' 00:10:16.572 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:16.572 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:16.572 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:16.572 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:10:16.572 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:16.831 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:16.831 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:16.831 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:16.831 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:16.831 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:16.831 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:16.831 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:16.831 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:17.092 [2024-07-12 11:51:07.141886] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:17.092 [2024-07-12 11:51:07.141908] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:17.092 [2024-07-12 11:51:07.141938] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:17.092 11:51:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:17.092 "name": "Existed_Raid", 00:10:17.092 "uuid": "9268a750-1b37-4493-8dc0-748b1144f76c", 00:10:17.092 "strip_size_kb": 64, 00:10:17.092 "state": "offline", 00:10:17.092 "raid_level": "concat", 00:10:17.092 "superblock": true, 00:10:17.092 "num_base_bdevs": 2, 00:10:17.092 "num_base_bdevs_discovered": 1, 00:10:17.092 "num_base_bdevs_operational": 1, 00:10:17.092 "base_bdevs_list": 
[ 00:10:17.092 { 00:10:17.092 "name": null, 00:10:17.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:17.092 "is_configured": false, 00:10:17.092 "data_offset": 2048, 00:10:17.092 "data_size": 63488 00:10:17.092 }, 00:10:17.092 { 00:10:17.092 "name": "BaseBdev2", 00:10:17.092 "uuid": "dbaefc2b-1bab-4a88-a62e-830772cfdc60", 00:10:17.092 "is_configured": true, 00:10:17.092 "data_offset": 2048, 00:10:17.092 "data_size": 63488 00:10:17.092 } 00:10:17.092 ] 00:10:17.092 }' 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:17.092 11:51:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:17.658 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:17.658 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:17.658 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.658 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:17.916 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:17.916 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:17.916 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:17.916 [2024-07-12 11:51:08.149437] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:17.916 [2024-07-12 11:51:08.149479] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f49890 name Existed_Raid, state offline 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 597901 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 597901 ']' 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 597901 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 597901 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:18.175 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:18.176 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 597901' 00:10:18.176 killing process with pid 597901 00:10:18.176 11:51:08 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 597901 00:10:18.176 [2024-07-12 11:51:08.387420] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:18.176 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 597901 00:10:18.176 [2024-07-12 11:51:08.388205] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:18.435 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:18.435 00:10:18.435 real 0m8.184s 00:10:18.435 user 0m14.646s 00:10:18.435 sys 0m1.330s 00:10:18.435 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:18.435 11:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:18.435 ************************************ 00:10:18.435 END TEST raid_state_function_test_sb 00:10:18.435 ************************************ 00:10:18.435 11:51:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:18.435 11:51:08 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:10:18.435 11:51:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:18.435 11:51:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:18.435 11:51:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:18.435 ************************************ 00:10:18.435 START TEST raid_superblock_test 00:10:18.435 ************************************ 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:18.435 11:51:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=599754 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 599754 /var/tmp/spdk-raid.sock 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 599754 ']' 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:18.435 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:18.435 11:51:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:18.695 [2024-07-12 11:51:08.686662] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:10:18.695 [2024-07-12 11:51:08.686704] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid599754 ] 00:10:18.695 [2024-07-12 11:51:08.751029] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:18.695 [2024-07-12 11:51:08.828609] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:18.695 [2024-07-12 11:51:08.884430] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:18.695 [2024-07-12 11:51:08.884458] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 
00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:19.263 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:19.522 malloc1 00:10:19.522 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:19.781 [2024-07-12 11:51:09.812749] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:19.781 [2024-07-12 11:51:09.812784] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:19.781 [2024-07-12 11:51:09.812796] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2286270 00:10:19.781 [2024-07-12 11:51:09.812802] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:19.781 [2024-07-12 11:51:09.813961] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:19.781 [2024-07-12 11:51:09.813981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:19.781 pt1 00:10:19.781 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:19.781 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= 
num_base_bdevs )) 00:10:19.781 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:19.781 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:19.781 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:19.781 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:19.781 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:19.781 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:19.781 11:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:19.781 malloc2 00:10:19.781 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:20.041 [2024-07-12 11:51:10.161211] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:20.041 [2024-07-12 11:51:10.161248] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:20.041 [2024-07-12 11:51:10.161258] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2287580 00:10:20.041 [2024-07-12 11:51:10.161265] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:20.041 [2024-07-12 11:51:10.162399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:20.041 [2024-07-12 11:51:10.162421] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:20.041 pt2 00:10:20.041 11:51:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:20.041 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:20.041 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:10:20.299 [2024-07-12 11:51:10.329663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:20.299 [2024-07-12 11:51:10.330537] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:20.299 [2024-07-12 11:51:10.330635] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2431890 00:10:20.299 [2024-07-12 11:51:10.330642] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:20.299 [2024-07-12 11:51:10.330776] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24338f0 00:10:20.299 [2024-07-12 11:51:10.330869] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2431890 00:10:20.299 [2024-07-12 11:51:10.330874] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2431890 00:10:20.300 [2024-07-12 11:51:10.330937] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:20.300 11:51:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:20.300 "name": "raid_bdev1", 00:10:20.300 "uuid": "0be5f46d-620f-42e4-b361-08b774c208a7", 00:10:20.300 "strip_size_kb": 64, 00:10:20.300 "state": "online", 00:10:20.300 "raid_level": "concat", 00:10:20.300 "superblock": true, 00:10:20.300 "num_base_bdevs": 2, 00:10:20.300 "num_base_bdevs_discovered": 2, 00:10:20.300 "num_base_bdevs_operational": 2, 00:10:20.300 "base_bdevs_list": [ 00:10:20.300 { 00:10:20.300 "name": "pt1", 00:10:20.300 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:20.300 "is_configured": true, 00:10:20.300 "data_offset": 2048, 00:10:20.300 "data_size": 63488 00:10:20.300 }, 00:10:20.300 { 00:10:20.300 "name": "pt2", 00:10:20.300 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:20.300 "is_configured": true, 00:10:20.300 "data_offset": 2048, 00:10:20.300 "data_size": 63488 00:10:20.300 } 00:10:20.300 ] 00:10:20.300 }' 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:20.300 11:51:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
00:10:20.868 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:20.868 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:20.868 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:20.868 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:20.868 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:20.868 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:20.868 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:20.868 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:21.127 [2024-07-12 11:51:11.164015] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:21.127 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:21.127 "name": "raid_bdev1", 00:10:21.127 "aliases": [ 00:10:21.127 "0be5f46d-620f-42e4-b361-08b774c208a7" 00:10:21.127 ], 00:10:21.127 "product_name": "Raid Volume", 00:10:21.127 "block_size": 512, 00:10:21.127 "num_blocks": 126976, 00:10:21.127 "uuid": "0be5f46d-620f-42e4-b361-08b774c208a7", 00:10:21.127 "assigned_rate_limits": { 00:10:21.127 "rw_ios_per_sec": 0, 00:10:21.127 "rw_mbytes_per_sec": 0, 00:10:21.127 "r_mbytes_per_sec": 0, 00:10:21.127 "w_mbytes_per_sec": 0 00:10:21.127 }, 00:10:21.127 "claimed": false, 00:10:21.127 "zoned": false, 00:10:21.127 "supported_io_types": { 00:10:21.127 "read": true, 00:10:21.127 "write": true, 00:10:21.127 "unmap": true, 00:10:21.127 "flush": true, 00:10:21.127 "reset": true, 00:10:21.128 "nvme_admin": false, 00:10:21.128 "nvme_io": false, 00:10:21.128 "nvme_io_md": false, 
00:10:21.128 "write_zeroes": true, 00:10:21.128 "zcopy": false, 00:10:21.128 "get_zone_info": false, 00:10:21.128 "zone_management": false, 00:10:21.128 "zone_append": false, 00:10:21.128 "compare": false, 00:10:21.128 "compare_and_write": false, 00:10:21.128 "abort": false, 00:10:21.128 "seek_hole": false, 00:10:21.128 "seek_data": false, 00:10:21.128 "copy": false, 00:10:21.128 "nvme_iov_md": false 00:10:21.128 }, 00:10:21.128 "memory_domains": [ 00:10:21.128 { 00:10:21.128 "dma_device_id": "system", 00:10:21.128 "dma_device_type": 1 00:10:21.128 }, 00:10:21.128 { 00:10:21.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.128 "dma_device_type": 2 00:10:21.128 }, 00:10:21.128 { 00:10:21.128 "dma_device_id": "system", 00:10:21.128 "dma_device_type": 1 00:10:21.128 }, 00:10:21.128 { 00:10:21.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.128 "dma_device_type": 2 00:10:21.128 } 00:10:21.128 ], 00:10:21.128 "driver_specific": { 00:10:21.128 "raid": { 00:10:21.128 "uuid": "0be5f46d-620f-42e4-b361-08b774c208a7", 00:10:21.128 "strip_size_kb": 64, 00:10:21.128 "state": "online", 00:10:21.128 "raid_level": "concat", 00:10:21.128 "superblock": true, 00:10:21.128 "num_base_bdevs": 2, 00:10:21.128 "num_base_bdevs_discovered": 2, 00:10:21.128 "num_base_bdevs_operational": 2, 00:10:21.128 "base_bdevs_list": [ 00:10:21.128 { 00:10:21.128 "name": "pt1", 00:10:21.128 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:21.128 "is_configured": true, 00:10:21.128 "data_offset": 2048, 00:10:21.128 "data_size": 63488 00:10:21.128 }, 00:10:21.128 { 00:10:21.128 "name": "pt2", 00:10:21.128 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:21.128 "is_configured": true, 00:10:21.128 "data_offset": 2048, 00:10:21.128 "data_size": 63488 00:10:21.128 } 00:10:21.128 ] 00:10:21.128 } 00:10:21.128 } 00:10:21.128 }' 00:10:21.128 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == 
true).name' 00:10:21.128 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:21.128 pt2' 00:10:21.128 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:21.128 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:21.128 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:21.387 "name": "pt1", 00:10:21.387 "aliases": [ 00:10:21.387 "00000000-0000-0000-0000-000000000001" 00:10:21.387 ], 00:10:21.387 "product_name": "passthru", 00:10:21.387 "block_size": 512, 00:10:21.387 "num_blocks": 65536, 00:10:21.387 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:21.387 "assigned_rate_limits": { 00:10:21.387 "rw_ios_per_sec": 0, 00:10:21.387 "rw_mbytes_per_sec": 0, 00:10:21.387 "r_mbytes_per_sec": 0, 00:10:21.387 "w_mbytes_per_sec": 0 00:10:21.387 }, 00:10:21.387 "claimed": true, 00:10:21.387 "claim_type": "exclusive_write", 00:10:21.387 "zoned": false, 00:10:21.387 "supported_io_types": { 00:10:21.387 "read": true, 00:10:21.387 "write": true, 00:10:21.387 "unmap": true, 00:10:21.387 "flush": true, 00:10:21.387 "reset": true, 00:10:21.387 "nvme_admin": false, 00:10:21.387 "nvme_io": false, 00:10:21.387 "nvme_io_md": false, 00:10:21.387 "write_zeroes": true, 00:10:21.387 "zcopy": true, 00:10:21.387 "get_zone_info": false, 00:10:21.387 "zone_management": false, 00:10:21.387 "zone_append": false, 00:10:21.387 "compare": false, 00:10:21.387 "compare_and_write": false, 00:10:21.387 "abort": true, 00:10:21.387 "seek_hole": false, 00:10:21.387 "seek_data": false, 00:10:21.387 "copy": true, 00:10:21.387 "nvme_iov_md": false 00:10:21.387 }, 00:10:21.387 "memory_domains": [ 00:10:21.387 { 00:10:21.387 "dma_device_id": 
"system", 00:10:21.387 "dma_device_type": 1 00:10:21.387 }, 00:10:21.387 { 00:10:21.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.387 "dma_device_type": 2 00:10:21.387 } 00:10:21.387 ], 00:10:21.387 "driver_specific": { 00:10:21.387 "passthru": { 00:10:21.387 "name": "pt1", 00:10:21.387 "base_bdev_name": "malloc1" 00:10:21.387 } 00:10:21.387 } 00:10:21.387 }' 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:21.387 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.646 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.646 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:21.646 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:21.646 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:21.646 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:21.646 11:51:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:21.646 "name": "pt2", 00:10:21.646 "aliases": [ 00:10:21.646 "00000000-0000-0000-0000-000000000002" 00:10:21.646 ], 00:10:21.646 "product_name": "passthru", 00:10:21.646 "block_size": 512, 00:10:21.646 "num_blocks": 65536, 00:10:21.646 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:21.646 "assigned_rate_limits": { 00:10:21.646 "rw_ios_per_sec": 0, 00:10:21.646 "rw_mbytes_per_sec": 0, 00:10:21.646 "r_mbytes_per_sec": 0, 00:10:21.646 "w_mbytes_per_sec": 0 00:10:21.646 }, 00:10:21.646 "claimed": true, 00:10:21.646 "claim_type": "exclusive_write", 00:10:21.646 "zoned": false, 00:10:21.646 "supported_io_types": { 00:10:21.646 "read": true, 00:10:21.646 "write": true, 00:10:21.646 "unmap": true, 00:10:21.646 "flush": true, 00:10:21.646 "reset": true, 00:10:21.646 "nvme_admin": false, 00:10:21.646 "nvme_io": false, 00:10:21.646 "nvme_io_md": false, 00:10:21.646 "write_zeroes": true, 00:10:21.646 "zcopy": true, 00:10:21.646 "get_zone_info": false, 00:10:21.646 "zone_management": false, 00:10:21.646 "zone_append": false, 00:10:21.646 "compare": false, 00:10:21.646 "compare_and_write": false, 00:10:21.646 "abort": true, 00:10:21.646 "seek_hole": false, 00:10:21.646 "seek_data": false, 00:10:21.646 "copy": true, 00:10:21.646 "nvme_iov_md": false 00:10:21.646 }, 00:10:21.646 "memory_domains": [ 00:10:21.646 { 00:10:21.646 "dma_device_id": "system", 00:10:21.646 "dma_device_type": 1 00:10:21.646 }, 00:10:21.646 { 00:10:21.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.646 "dma_device_type": 2 00:10:21.646 } 00:10:21.646 ], 00:10:21.646 "driver_specific": { 00:10:21.646 "passthru": { 00:10:21.646 "name": "pt2", 00:10:21.646 "base_bdev_name": "malloc2" 00:10:21.646 } 00:10:21.646 } 00:10:21.646 }' 00:10:21.646 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:21.905 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:10:21.905 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:21.905 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.905 11:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:21.905 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:21.905 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.905 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:21.905 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:21.905 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:21.905 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:22.165 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:22.165 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:22.165 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:22.165 [2024-07-12 11:51:12.330997] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:22.165 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0be5f46d-620f-42e4-b361-08b774c208a7 00:10:22.165 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 0be5f46d-620f-42e4-b361-08b774c208a7 ']' 00:10:22.165 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:22.424 [2024-07-12 11:51:12.483208] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
raid_bdev1 00:10:22.424 [2024-07-12 11:51:12.483223] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:22.424 [2024-07-12 11:51:12.483263] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:22.424 [2024-07-12 11:51:12.483292] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:22.424 [2024-07-12 11:51:12.483298] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2431890 name raid_bdev1, state offline 00:10:22.424 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:22.424 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:22.424 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:22.424 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:22.424 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:22.424 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:22.683 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:22.684 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:22.941 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:22.941 11:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | 
any' 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:22.941 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:23.199 [2024-07-12 11:51:13.293297] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:23.199 [2024-07-12 11:51:13.294290] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:23.199 [2024-07-12 11:51:13.294333] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:23.199 [2024-07-12 11:51:13.294361] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:23.199 [2024-07-12 11:51:13.294371] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:23.199 [2024-07-12 11:51:13.294377] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2433890 name raid_bdev1, state configuring 00:10:23.199 request: 00:10:23.199 { 00:10:23.199 "name": "raid_bdev1", 00:10:23.199 "raid_level": "concat", 00:10:23.199 "base_bdevs": [ 00:10:23.199 "malloc1", 00:10:23.199 "malloc2" 00:10:23.199 ], 00:10:23.199 "superblock": false, 00:10:23.199 "strip_size_kb": 64, 00:10:23.199 "method": "bdev_raid_create", 00:10:23.199 "req_id": 1 00:10:23.199 } 00:10:23.199 Got JSON-RPC error response 00:10:23.199 response: 00:10:23.199 { 00:10:23.199 "code": -17, 00:10:23.199 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:23.199 } 00:10:23.199 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:23.199 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:23.199 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:23.199 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:23.199 11:51:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.199 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:23.457 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:23.458 [2024-07-12 11:51:13.626109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:23.458 [2024-07-12 11:51:13.626142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:23.458 [2024-07-12 11:51:13.626153] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2284ae0 00:10:23.458 [2024-07-12 11:51:13.626175] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:23.458 [2024-07-12 11:51:13.627364] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:23.458 [2024-07-12 11:51:13.627386] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:23.458 [2024-07-12 11:51:13.627437] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:23.458 [2024-07-12 11:51:13.627458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:23.458 pt1 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:23.458 11:51:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:23.458 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.716 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:23.716 "name": "raid_bdev1", 00:10:23.716 "uuid": "0be5f46d-620f-42e4-b361-08b774c208a7", 00:10:23.716 "strip_size_kb": 64, 00:10:23.716 "state": "configuring", 00:10:23.716 "raid_level": "concat", 00:10:23.716 "superblock": true, 00:10:23.716 "num_base_bdevs": 2, 00:10:23.716 "num_base_bdevs_discovered": 1, 00:10:23.716 "num_base_bdevs_operational": 2, 00:10:23.716 "base_bdevs_list": [ 00:10:23.716 { 00:10:23.716 "name": "pt1", 00:10:23.716 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:23.716 "is_configured": true, 00:10:23.716 "data_offset": 2048, 00:10:23.716 "data_size": 63488 00:10:23.716 }, 00:10:23.716 { 00:10:23.716 "name": null, 00:10:23.716 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:23.716 
"is_configured": false, 00:10:23.716 "data_offset": 2048, 00:10:23.716 "data_size": 63488 00:10:23.716 } 00:10:23.716 ] 00:10:23.716 }' 00:10:23.716 11:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:23.716 11:51:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:24.284 [2024-07-12 11:51:14.444225] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:24.284 [2024-07-12 11:51:14.444264] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:24.284 [2024-07-12 11:51:14.444275] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2285290 00:10:24.284 [2024-07-12 11:51:14.444281] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:24.284 [2024-07-12 11:51:14.444535] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:24.284 [2024-07-12 11:51:14.444549] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:24.284 [2024-07-12 11:51:14.444605] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:24.284 [2024-07-12 11:51:14.444620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:24.284 [2024-07-12 11:51:14.444691] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2436fe0 00:10:24.284 [2024-07-12 
11:51:14.444696] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:24.284 [2024-07-12 11:51:14.444808] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2431120 00:10:24.284 [2024-07-12 11:51:14.444890] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2436fe0 00:10:24.284 [2024-07-12 11:51:14.444895] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2436fe0 00:10:24.284 [2024-07-12 11:51:14.444962] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:24.284 pt2 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:24.284 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:24.285 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:24.285 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:24.285 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:24.285 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:24.285 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:24.285 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:24.285 11:51:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:24.285 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:24.543 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:24.543 "name": "raid_bdev1", 00:10:24.543 "uuid": "0be5f46d-620f-42e4-b361-08b774c208a7", 00:10:24.543 "strip_size_kb": 64, 00:10:24.543 "state": "online", 00:10:24.543 "raid_level": "concat", 00:10:24.544 "superblock": true, 00:10:24.544 "num_base_bdevs": 2, 00:10:24.544 "num_base_bdevs_discovered": 2, 00:10:24.544 "num_base_bdevs_operational": 2, 00:10:24.544 "base_bdevs_list": [ 00:10:24.544 { 00:10:24.544 "name": "pt1", 00:10:24.544 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:24.544 "is_configured": true, 00:10:24.544 "data_offset": 2048, 00:10:24.544 "data_size": 63488 00:10:24.544 }, 00:10:24.544 { 00:10:24.544 "name": "pt2", 00:10:24.544 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:24.544 "is_configured": true, 00:10:24.544 "data_offset": 2048, 00:10:24.544 "data_size": 63488 00:10:24.544 } 00:10:24.544 ] 00:10:24.544 }' 00:10:24.544 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:24.544 11:51:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:25.110 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:25.110 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:25.110 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:25.110 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:25.110 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:25.110 
11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:25.110 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:25.110 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:25.110 [2024-07-12 11:51:15.266555] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:25.110 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:25.110 "name": "raid_bdev1", 00:10:25.110 "aliases": [ 00:10:25.110 "0be5f46d-620f-42e4-b361-08b774c208a7" 00:10:25.110 ], 00:10:25.110 "product_name": "Raid Volume", 00:10:25.110 "block_size": 512, 00:10:25.110 "num_blocks": 126976, 00:10:25.110 "uuid": "0be5f46d-620f-42e4-b361-08b774c208a7", 00:10:25.110 "assigned_rate_limits": { 00:10:25.110 "rw_ios_per_sec": 0, 00:10:25.110 "rw_mbytes_per_sec": 0, 00:10:25.110 "r_mbytes_per_sec": 0, 00:10:25.110 "w_mbytes_per_sec": 0 00:10:25.110 }, 00:10:25.110 "claimed": false, 00:10:25.110 "zoned": false, 00:10:25.110 "supported_io_types": { 00:10:25.110 "read": true, 00:10:25.110 "write": true, 00:10:25.110 "unmap": true, 00:10:25.110 "flush": true, 00:10:25.110 "reset": true, 00:10:25.110 "nvme_admin": false, 00:10:25.110 "nvme_io": false, 00:10:25.110 "nvme_io_md": false, 00:10:25.110 "write_zeroes": true, 00:10:25.110 "zcopy": false, 00:10:25.110 "get_zone_info": false, 00:10:25.110 "zone_management": false, 00:10:25.110 "zone_append": false, 00:10:25.110 "compare": false, 00:10:25.110 "compare_and_write": false, 00:10:25.110 "abort": false, 00:10:25.110 "seek_hole": false, 00:10:25.110 "seek_data": false, 00:10:25.110 "copy": false, 00:10:25.110 "nvme_iov_md": false 00:10:25.110 }, 00:10:25.110 "memory_domains": [ 00:10:25.110 { 00:10:25.110 "dma_device_id": "system", 00:10:25.110 "dma_device_type": 1 00:10:25.110 }, 
00:10:25.110 { 00:10:25.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:25.110 "dma_device_type": 2 00:10:25.110 }, 00:10:25.110 { 00:10:25.110 "dma_device_id": "system", 00:10:25.110 "dma_device_type": 1 00:10:25.110 }, 00:10:25.110 { 00:10:25.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:25.110 "dma_device_type": 2 00:10:25.110 } 00:10:25.110 ], 00:10:25.110 "driver_specific": { 00:10:25.110 "raid": { 00:10:25.110 "uuid": "0be5f46d-620f-42e4-b361-08b774c208a7", 00:10:25.110 "strip_size_kb": 64, 00:10:25.110 "state": "online", 00:10:25.110 "raid_level": "concat", 00:10:25.110 "superblock": true, 00:10:25.110 "num_base_bdevs": 2, 00:10:25.110 "num_base_bdevs_discovered": 2, 00:10:25.110 "num_base_bdevs_operational": 2, 00:10:25.110 "base_bdevs_list": [ 00:10:25.110 { 00:10:25.110 "name": "pt1", 00:10:25.110 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:25.110 "is_configured": true, 00:10:25.110 "data_offset": 2048, 00:10:25.110 "data_size": 63488 00:10:25.110 }, 00:10:25.110 { 00:10:25.110 "name": "pt2", 00:10:25.110 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:25.110 "is_configured": true, 00:10:25.110 "data_offset": 2048, 00:10:25.110 "data_size": 63488 00:10:25.110 } 00:10:25.110 ] 00:10:25.110 } 00:10:25.110 } 00:10:25.110 }' 00:10:25.110 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:25.111 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:25.111 pt2' 00:10:25.111 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:25.111 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:25.111 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:25.370 11:51:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:25.370 "name": "pt1", 00:10:25.370 "aliases": [ 00:10:25.370 "00000000-0000-0000-0000-000000000001" 00:10:25.370 ], 00:10:25.370 "product_name": "passthru", 00:10:25.370 "block_size": 512, 00:10:25.370 "num_blocks": 65536, 00:10:25.370 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:25.370 "assigned_rate_limits": { 00:10:25.370 "rw_ios_per_sec": 0, 00:10:25.370 "rw_mbytes_per_sec": 0, 00:10:25.370 "r_mbytes_per_sec": 0, 00:10:25.370 "w_mbytes_per_sec": 0 00:10:25.370 }, 00:10:25.370 "claimed": true, 00:10:25.370 "claim_type": "exclusive_write", 00:10:25.370 "zoned": false, 00:10:25.370 "supported_io_types": { 00:10:25.370 "read": true, 00:10:25.370 "write": true, 00:10:25.370 "unmap": true, 00:10:25.370 "flush": true, 00:10:25.370 "reset": true, 00:10:25.370 "nvme_admin": false, 00:10:25.370 "nvme_io": false, 00:10:25.370 "nvme_io_md": false, 00:10:25.370 "write_zeroes": true, 00:10:25.370 "zcopy": true, 00:10:25.370 "get_zone_info": false, 00:10:25.370 "zone_management": false, 00:10:25.370 "zone_append": false, 00:10:25.370 "compare": false, 00:10:25.370 "compare_and_write": false, 00:10:25.370 "abort": true, 00:10:25.370 "seek_hole": false, 00:10:25.370 "seek_data": false, 00:10:25.370 "copy": true, 00:10:25.370 "nvme_iov_md": false 00:10:25.370 }, 00:10:25.370 "memory_domains": [ 00:10:25.370 { 00:10:25.370 "dma_device_id": "system", 00:10:25.370 "dma_device_type": 1 00:10:25.370 }, 00:10:25.370 { 00:10:25.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:25.370 "dma_device_type": 2 00:10:25.370 } 00:10:25.370 ], 00:10:25.370 "driver_specific": { 00:10:25.370 "passthru": { 00:10:25.370 "name": "pt1", 00:10:25.370 "base_bdev_name": "malloc1" 00:10:25.370 } 00:10:25.370 } 00:10:25.370 }' 00:10:25.370 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:25.370 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:10:25.370 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:25.370 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:25.628 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:25.887 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:25.887 "name": "pt2", 00:10:25.887 "aliases": [ 00:10:25.887 "00000000-0000-0000-0000-000000000002" 00:10:25.887 ], 00:10:25.887 "product_name": "passthru", 00:10:25.887 "block_size": 512, 00:10:25.887 "num_blocks": 65536, 00:10:25.887 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:25.887 "assigned_rate_limits": { 00:10:25.887 "rw_ios_per_sec": 0, 00:10:25.887 "rw_mbytes_per_sec": 0, 00:10:25.887 "r_mbytes_per_sec": 0, 00:10:25.887 "w_mbytes_per_sec": 0 00:10:25.887 }, 
00:10:25.887 "claimed": true, 00:10:25.887 "claim_type": "exclusive_write", 00:10:25.887 "zoned": false, 00:10:25.887 "supported_io_types": { 00:10:25.887 "read": true, 00:10:25.887 "write": true, 00:10:25.887 "unmap": true, 00:10:25.887 "flush": true, 00:10:25.887 "reset": true, 00:10:25.887 "nvme_admin": false, 00:10:25.887 "nvme_io": false, 00:10:25.887 "nvme_io_md": false, 00:10:25.887 "write_zeroes": true, 00:10:25.887 "zcopy": true, 00:10:25.887 "get_zone_info": false, 00:10:25.887 "zone_management": false, 00:10:25.887 "zone_append": false, 00:10:25.887 "compare": false, 00:10:25.887 "compare_and_write": false, 00:10:25.887 "abort": true, 00:10:25.887 "seek_hole": false, 00:10:25.887 "seek_data": false, 00:10:25.887 "copy": true, 00:10:25.887 "nvme_iov_md": false 00:10:25.887 }, 00:10:25.887 "memory_domains": [ 00:10:25.887 { 00:10:25.887 "dma_device_id": "system", 00:10:25.887 "dma_device_type": 1 00:10:25.887 }, 00:10:25.887 { 00:10:25.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:25.887 "dma_device_type": 2 00:10:25.887 } 00:10:25.887 ], 00:10:25.887 "driver_specific": { 00:10:25.887 "passthru": { 00:10:25.887 "name": "pt2", 00:10:25.887 "base_bdev_name": "malloc2" 00:10:25.887 } 00:10:25.887 } 00:10:25.887 }' 00:10:25.887 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:25.887 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:25.887 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:25.887 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:25.887 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:25.887 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:25.887 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:26.146 11:51:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:26.146 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:26.146 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:26.146 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:26.146 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:26.146 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:26.146 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:26.405 [2024-07-12 11:51:16.425489] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 0be5f46d-620f-42e4-b361-08b774c208a7 '!=' 0be5f46d-620f-42e4-b361-08b774c208a7 ']' 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 599754 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 599754 ']' 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 599754 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 599754 00:10:26.405 11:51:16 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 599754' 00:10:26.405 killing process with pid 599754 00:10:26.405 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 599754 00:10:26.405 [2024-07-12 11:51:16.483864] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:26.405 [2024-07-12 11:51:16.483906] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:26.405 [2024-07-12 11:51:16.483936] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:26.406 [2024-07-12 11:51:16.483942] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2436fe0 name raid_bdev1, state offline 00:10:26.406 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 599754 00:10:26.406 [2024-07-12 11:51:16.498974] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:26.664 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:26.664 00:10:26.664 real 0m8.040s 00:10:26.664 user 0m14.474s 00:10:26.664 sys 0m1.304s 00:10:26.664 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:26.664 11:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:26.664 ************************************ 00:10:26.664 END TEST raid_superblock_test 00:10:26.664 ************************************ 00:10:26.664 11:51:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:26.664 11:51:16 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:10:26.664 11:51:16 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:26.664 11:51:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:26.664 11:51:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:26.664 ************************************ 00:10:26.664 START TEST raid_read_error_test 00:10:26.664 ************************************ 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:26.664 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.1m6XhHhxn9 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=601568 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 601568 /var/tmp/spdk-raid.sock 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 601568 ']' 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:10:26.665 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:26.665 11:51:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:26.665 [2024-07-12 11:51:16.796377] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:10:26.665 [2024-07-12 11:51:16.796412] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid601568 ] 00:10:26.665 [2024-07-12 11:51:16.859227] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:26.923 [2024-07-12 11:51:16.938551] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:26.923 [2024-07-12 11:51:16.989485] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:26.923 [2024-07-12 11:51:16.989512] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:27.489 11:51:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:27.489 11:51:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:27.489 11:51:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:27.489 11:51:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:27.746 BaseBdev1_malloc 00:10:27.746 11:51:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:27.746 true 00:10:27.746 11:51:17 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:28.004 [2024-07-12 11:51:18.073413] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:28.004 [2024-07-12 11:51:18.073444] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:28.004 [2024-07-12 11:51:18.073455] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16662d0 00:10:28.004 [2024-07-12 11:51:18.073461] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:28.004 [2024-07-12 11:51:18.074681] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:28.004 [2024-07-12 11:51:18.074704] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:28.004 BaseBdev1 00:10:28.004 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:28.004 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:28.004 BaseBdev2_malloc 00:10:28.004 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:28.314 true 00:10:28.314 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:28.314 [2024-07-12 11:51:18.554079] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:28.314 [2024-07-12 11:51:18.554107] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:10:28.314 [2024-07-12 11:51:18.554117] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x166af40 00:10:28.314 [2024-07-12 11:51:18.554123] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:28.314 [2024-07-12 11:51:18.555174] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:28.314 [2024-07-12 11:51:18.555194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:28.314 BaseBdev2 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:28.572 [2024-07-12 11:51:18.706496] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:28.572 [2024-07-12 11:51:18.707346] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:28.572 [2024-07-12 11:51:18.707471] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x166bc80 00:10:28.572 [2024-07-12 11:51:18.707479] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:28.572 [2024-07-12 11:51:18.707614] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x166eaf0 00:10:28.572 [2024-07-12 11:51:18.707713] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x166bc80 00:10:28.572 [2024-07-12 11:51:18.707718] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x166bc80 00:10:28.572 [2024-07-12 11:51:18.707785] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.572 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:28.831 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:28.831 "name": "raid_bdev1", 00:10:28.831 "uuid": "95d03d7a-dc40-452f-95f6-140cd84f3246", 00:10:28.831 "strip_size_kb": 64, 00:10:28.831 "state": "online", 00:10:28.831 "raid_level": "concat", 00:10:28.831 "superblock": true, 00:10:28.831 "num_base_bdevs": 2, 00:10:28.831 "num_base_bdevs_discovered": 2, 00:10:28.831 "num_base_bdevs_operational": 2, 00:10:28.831 "base_bdevs_list": [ 00:10:28.831 { 00:10:28.831 "name": "BaseBdev1", 00:10:28.831 "uuid": "f09c99ff-a4d5-575d-a3a5-0eceeb762afb", 00:10:28.831 "is_configured": true, 00:10:28.831 "data_offset": 2048, 00:10:28.831 "data_size": 63488 00:10:28.831 }, 00:10:28.831 { 00:10:28.831 "name": "BaseBdev2", 00:10:28.831 "uuid": 
"29018910-c7a5-543f-8976-1281f3e31b41", 00:10:28.831 "is_configured": true, 00:10:28.831 "data_offset": 2048, 00:10:28.831 "data_size": 63488 00:10:28.831 } 00:10:28.831 ] 00:10:28.831 }' 00:10:28.831 11:51:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:28.831 11:51:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:29.395 11:51:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:29.395 11:51:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:29.395 [2024-07-12 11:51:19.444582] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x166d730 00:10:30.330 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:30.331 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:30.590 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:30.590 "name": "raid_bdev1", 00:10:30.590 "uuid": "95d03d7a-dc40-452f-95f6-140cd84f3246", 00:10:30.590 "strip_size_kb": 64, 00:10:30.590 "state": "online", 00:10:30.590 "raid_level": "concat", 00:10:30.590 "superblock": true, 00:10:30.590 "num_base_bdevs": 2, 00:10:30.590 "num_base_bdevs_discovered": 2, 00:10:30.590 "num_base_bdevs_operational": 2, 00:10:30.590 "base_bdevs_list": [ 00:10:30.590 { 00:10:30.590 "name": "BaseBdev1", 00:10:30.590 "uuid": "f09c99ff-a4d5-575d-a3a5-0eceeb762afb", 00:10:30.590 "is_configured": true, 00:10:30.590 "data_offset": 2048, 00:10:30.590 "data_size": 63488 00:10:30.590 }, 00:10:30.590 { 00:10:30.590 "name": "BaseBdev2", 00:10:30.590 "uuid": "29018910-c7a5-543f-8976-1281f3e31b41", 00:10:30.590 "is_configured": true, 00:10:30.590 "data_offset": 2048, 00:10:30.590 "data_size": 63488 00:10:30.590 } 00:10:30.590 ] 00:10:30.590 }' 00:10:30.590 11:51:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:30.590 11:51:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.157 11:51:21 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:31.157 [2024-07-12 11:51:21.372873] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:31.157 [2024-07-12 11:51:21.372908] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:31.157 [2024-07-12 11:51:21.374894] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:31.157 [2024-07-12 11:51:21.374920] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:31.157 [2024-07-12 11:51:21.374937] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:31.157 [2024-07-12 11:51:21.374942] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x166bc80 name raid_bdev1, state offline 00:10:31.157 0 00:10:31.157 11:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 601568 00:10:31.157 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 601568 ']' 00:10:31.157 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 601568 00:10:31.157 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:10:31.157 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:31.157 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 601568 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 601568' 00:10:31.416 
killing process with pid 601568 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 601568 00:10:31.416 [2024-07-12 11:51:21.440292] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 601568 00:10:31.416 [2024-07-12 11:51:21.449383] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.1m6XhHhxn9 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:10:31.416 00:10:31.416 real 0m4.901s 00:10:31.416 user 0m7.520s 00:10:31.416 sys 0m0.699s 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:31.416 11:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.416 ************************************ 00:10:31.416 END TEST raid_read_error_test 00:10:31.416 ************************************ 00:10:31.676 11:51:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:31.676 11:51:21 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:10:31.676 11:51:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
00:10:31.676 11:51:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:31.676 11:51:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:31.676 ************************************ 00:10:31.676 START TEST raid_write_error_test 00:10:31.676 ************************************ 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local 
strip_size 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.03HKhb0SNG 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=602357 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 602357 /var/tmp/spdk-raid.sock 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 602357 ']' 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:31.676 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:31.676 11:51:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.676 [2024-07-12 11:51:21.767587] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:10:31.676 [2024-07-12 11:51:21.767625] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid602357 ] 00:10:31.676 [2024-07-12 11:51:21.831824] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:31.676 [2024-07-12 11:51:21.902828] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:31.936 [2024-07-12 11:51:21.955212] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:31.936 [2024-07-12 11:51:21.955242] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:32.505 11:51:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:32.505 11:51:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:32.505 11:51:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:32.505 11:51:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:32.505 BaseBdev1_malloc 00:10:32.505 11:51:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:32.765 true 00:10:32.765 11:51:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:33.023 [2024-07-12 11:51:23.027856] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:33.023 [2024-07-12 11:51:23.027890] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:33.023 [2024-07-12 11:51:23.027900] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f82d0 00:10:33.023 [2024-07-12 11:51:23.027906] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:33.023 [2024-07-12 11:51:23.029042] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:33.023 [2024-07-12 11:51:23.029061] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:33.023 BaseBdev1 00:10:33.023 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:33.023 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:33.023 BaseBdev2_malloc 00:10:33.023 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:33.282 true 00:10:33.282 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:33.542 [2024-07-12 11:51:23.556419] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:33.542 [2024-07-12 11:51:23.556451] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:33.542 [2024-07-12 11:51:23.556462] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x26fcf40 00:10:33.542 [2024-07-12 11:51:23.556468] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:33.542 [2024-07-12 11:51:23.557469] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:33.542 [2024-07-12 11:51:23.557489] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:33.542 BaseBdev2 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:33.542 [2024-07-12 11:51:23.724879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:33.542 [2024-07-12 11:51:23.725727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:33.542 [2024-07-12 11:51:23.725851] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26fdc80 00:10:33.542 [2024-07-12 11:51:23.725859] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:33.542 [2024-07-12 11:51:23.725982] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2700af0 00:10:33.542 [2024-07-12 11:51:23.726080] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26fdc80 00:10:33.542 [2024-07-12 11:51:23.726085] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26fdc80 00:10:33.542 [2024-07-12 11:51:23.726150] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:33.542 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.802 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:33.802 "name": "raid_bdev1", 00:10:33.802 "uuid": "e168d2e0-79d7-40f0-b3ec-86e891f2344f", 00:10:33.802 "strip_size_kb": 64, 00:10:33.802 "state": "online", 00:10:33.802 "raid_level": "concat", 00:10:33.802 "superblock": true, 00:10:33.802 "num_base_bdevs": 2, 00:10:33.802 "num_base_bdevs_discovered": 2, 00:10:33.802 "num_base_bdevs_operational": 2, 00:10:33.802 "base_bdevs_list": [ 00:10:33.802 { 00:10:33.802 "name": "BaseBdev1", 00:10:33.802 "uuid": "7145c006-a38c-5091-9b26-510fff5f4c1e", 00:10:33.802 "is_configured": true, 00:10:33.802 "data_offset": 2048, 00:10:33.802 "data_size": 63488 00:10:33.802 }, 00:10:33.802 { 00:10:33.802 "name": "BaseBdev2", 00:10:33.802 "uuid": "a72a7238-0c65-50d2-a186-c6578287ebc2", 00:10:33.802 "is_configured": true, 
00:10:33.802 "data_offset": 2048, 00:10:33.802 "data_size": 63488 00:10:33.802 } 00:10:33.802 ] 00:10:33.802 }' 00:10:33.802 11:51:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:33.802 11:51:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.369 11:51:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:34.369 11:51:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:34.369 [2024-07-12 11:51:24.491036] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ff730 00:10:35.305 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:35.564 "name": "raid_bdev1", 00:10:35.564 "uuid": "e168d2e0-79d7-40f0-b3ec-86e891f2344f", 00:10:35.564 "strip_size_kb": 64, 00:10:35.564 "state": "online", 00:10:35.564 "raid_level": "concat", 00:10:35.564 "superblock": true, 00:10:35.564 "num_base_bdevs": 2, 00:10:35.564 "num_base_bdevs_discovered": 2, 00:10:35.564 "num_base_bdevs_operational": 2, 00:10:35.564 "base_bdevs_list": [ 00:10:35.564 { 00:10:35.564 "name": "BaseBdev1", 00:10:35.564 "uuid": "7145c006-a38c-5091-9b26-510fff5f4c1e", 00:10:35.564 "is_configured": true, 00:10:35.564 "data_offset": 2048, 00:10:35.564 "data_size": 63488 00:10:35.564 }, 00:10:35.564 { 00:10:35.564 "name": "BaseBdev2", 00:10:35.564 "uuid": "a72a7238-0c65-50d2-a186-c6578287ebc2", 00:10:35.564 "is_configured": true, 00:10:35.564 "data_offset": 2048, 00:10:35.564 "data_size": 63488 00:10:35.564 } 00:10:35.564 ] 00:10:35.564 }' 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:35.564 11:51:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.133 11:51:26 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:36.392 [2024-07-12 11:51:26.407640] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:36.392 [2024-07-12 11:51:26.407666] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:36.392 [2024-07-12 11:51:26.409828] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:36.392 [2024-07-12 11:51:26.409850] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:36.392 [2024-07-12 11:51:26.409870] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:36.392 [2024-07-12 11:51:26.409876] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26fdc80 name raid_bdev1, state offline 00:10:36.392 0 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 602357 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 602357 ']' 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 602357 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 602357 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 602357' 00:10:36.392 killing process with pid 
602357 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 602357 00:10:36.392 [2024-07-12 11:51:26.468080] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:36.392 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 602357 00:10:36.392 [2024-07-12 11:51:26.477701] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.03HKhb0SNG 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:10:36.652 00:10:36.652 real 0m4.967s 00:10:36.652 user 0m7.600s 00:10:36.652 sys 0m0.722s 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:36.652 11:51:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.652 ************************************ 00:10:36.652 END TEST raid_write_error_test 00:10:36.652 ************************************ 00:10:36.652 11:51:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:36.652 11:51:26 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:36.652 11:51:26 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 
00:10:36.652 11:51:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:36.652 11:51:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:36.652 11:51:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:36.652 ************************************ 00:10:36.652 START TEST raid_state_function_test 00:10:36.652 ************************************ 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=603361 00:10:36.652 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 603361' 00:10:36.652 Process raid pid: 603361 00:10:36.653 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:36.653 11:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 603361 /var/tmp/spdk-raid.sock 00:10:36.653 11:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 603361 ']' 00:10:36.653 11:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:36.653 11:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:36.653 11:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:36.653 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:36.653 11:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:36.653 11:51:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.653 [2024-07-12 11:51:26.792564] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:10:36.653 [2024-07-12 11:51:26.792602] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:36.653 [2024-07-12 11:51:26.856130] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:36.912 [2024-07-12 11:51:26.927810] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:36.912 [2024-07-12 11:51:26.976228] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:36.912 [2024-07-12 11:51:26.976250] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:37.480 11:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:37.480 11:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:37.480 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:37.480 [2024-07-12 11:51:27.722653] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:37.480 [2024-07-12 11:51:27.722687] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:37.480 [2024-07-12 11:51:27.722693] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:37.480 [2024-07-12 11:51:27.722698] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:37.739 "name": "Existed_Raid", 00:10:37.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:37.739 "strip_size_kb": 0, 00:10:37.739 "state": "configuring", 00:10:37.739 
"raid_level": "raid1", 00:10:37.739 "superblock": false, 00:10:37.739 "num_base_bdevs": 2, 00:10:37.739 "num_base_bdevs_discovered": 0, 00:10:37.739 "num_base_bdevs_operational": 2, 00:10:37.739 "base_bdevs_list": [ 00:10:37.739 { 00:10:37.739 "name": "BaseBdev1", 00:10:37.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:37.739 "is_configured": false, 00:10:37.739 "data_offset": 0, 00:10:37.739 "data_size": 0 00:10:37.739 }, 00:10:37.739 { 00:10:37.739 "name": "BaseBdev2", 00:10:37.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:37.739 "is_configured": false, 00:10:37.739 "data_offset": 0, 00:10:37.739 "data_size": 0 00:10:37.739 } 00:10:37.739 ] 00:10:37.739 }' 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:37.739 11:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:38.309 11:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:38.309 [2024-07-12 11:51:28.508593] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:38.309 [2024-07-12 11:51:28.508614] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xee51b0 name Existed_Raid, state configuring 00:10:38.309 11:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:38.567 [2024-07-12 11:51:28.685063] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:38.567 [2024-07-12 11:51:28.685084] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:38.567 [2024-07-12 11:51:28.685089] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
BaseBdev2 00:10:38.567 [2024-07-12 11:51:28.685094] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:38.567 11:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:38.826 [2024-07-12 11:51:28.877818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:38.826 BaseBdev1 00:10:38.826 11:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:38.826 11:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:38.826 11:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:38.826 11:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:38.826 11:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:38.826 11:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:38.826 11:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:39.085 [ 00:10:39.085 { 00:10:39.085 "name": "BaseBdev1", 00:10:39.085 "aliases": [ 00:10:39.085 "27085e38-1d09-4b6e-af30-8556ad738a8e" 00:10:39.085 ], 00:10:39.085 "product_name": "Malloc disk", 00:10:39.085 "block_size": 512, 00:10:39.085 "num_blocks": 65536, 00:10:39.085 "uuid": "27085e38-1d09-4b6e-af30-8556ad738a8e", 00:10:39.085 "assigned_rate_limits": { 00:10:39.085 
"rw_ios_per_sec": 0, 00:10:39.085 "rw_mbytes_per_sec": 0, 00:10:39.085 "r_mbytes_per_sec": 0, 00:10:39.085 "w_mbytes_per_sec": 0 00:10:39.085 }, 00:10:39.085 "claimed": true, 00:10:39.085 "claim_type": "exclusive_write", 00:10:39.085 "zoned": false, 00:10:39.085 "supported_io_types": { 00:10:39.085 "read": true, 00:10:39.085 "write": true, 00:10:39.085 "unmap": true, 00:10:39.085 "flush": true, 00:10:39.085 "reset": true, 00:10:39.085 "nvme_admin": false, 00:10:39.085 "nvme_io": false, 00:10:39.085 "nvme_io_md": false, 00:10:39.085 "write_zeroes": true, 00:10:39.085 "zcopy": true, 00:10:39.085 "get_zone_info": false, 00:10:39.085 "zone_management": false, 00:10:39.085 "zone_append": false, 00:10:39.085 "compare": false, 00:10:39.085 "compare_and_write": false, 00:10:39.085 "abort": true, 00:10:39.085 "seek_hole": false, 00:10:39.085 "seek_data": false, 00:10:39.085 "copy": true, 00:10:39.085 "nvme_iov_md": false 00:10:39.085 }, 00:10:39.085 "memory_domains": [ 00:10:39.085 { 00:10:39.085 "dma_device_id": "system", 00:10:39.085 "dma_device_type": 1 00:10:39.085 }, 00:10:39.085 { 00:10:39.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:39.085 "dma_device_type": 2 00:10:39.085 } 00:10:39.085 ], 00:10:39.085 "driver_specific": {} 00:10:39.085 } 00:10:39.085 ] 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:39.085 
11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:39.085 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:39.344 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:39.344 "name": "Existed_Raid", 00:10:39.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:39.344 "strip_size_kb": 0, 00:10:39.344 "state": "configuring", 00:10:39.344 "raid_level": "raid1", 00:10:39.344 "superblock": false, 00:10:39.344 "num_base_bdevs": 2, 00:10:39.344 "num_base_bdevs_discovered": 1, 00:10:39.344 "num_base_bdevs_operational": 2, 00:10:39.344 "base_bdevs_list": [ 00:10:39.344 { 00:10:39.344 "name": "BaseBdev1", 00:10:39.344 "uuid": "27085e38-1d09-4b6e-af30-8556ad738a8e", 00:10:39.344 "is_configured": true, 00:10:39.344 "data_offset": 0, 00:10:39.344 "data_size": 65536 00:10:39.344 }, 00:10:39.344 { 00:10:39.344 "name": "BaseBdev2", 00:10:39.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:39.344 "is_configured": false, 00:10:39.344 "data_offset": 0, 00:10:39.344 "data_size": 0 00:10:39.344 } 00:10:39.344 ] 00:10:39.344 }' 00:10:39.344 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:39.344 11:51:29 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:39.910 11:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:39.910 [2024-07-12 11:51:30.020756] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:39.910 [2024-07-12 11:51:30.020788] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xee4aa0 name Existed_Raid, state configuring 00:10:39.910 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:40.168 [2024-07-12 11:51:30.193226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:40.168 [2024-07-12 11:51:30.194377] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:40.168 [2024-07-12 11:51:30.194402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:40.168 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:40.168 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:40.168 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:40.168 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:40.168 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:40.168 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:40.168 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:10:40.168 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:40.168 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:40.169 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:40.169 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:40.169 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:40.169 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:40.169 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:40.169 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:40.169 "name": "Existed_Raid", 00:10:40.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:40.169 "strip_size_kb": 0, 00:10:40.169 "state": "configuring", 00:10:40.169 "raid_level": "raid1", 00:10:40.169 "superblock": false, 00:10:40.169 "num_base_bdevs": 2, 00:10:40.169 "num_base_bdevs_discovered": 1, 00:10:40.169 "num_base_bdevs_operational": 2, 00:10:40.169 "base_bdevs_list": [ 00:10:40.169 { 00:10:40.169 "name": "BaseBdev1", 00:10:40.169 "uuid": "27085e38-1d09-4b6e-af30-8556ad738a8e", 00:10:40.169 "is_configured": true, 00:10:40.169 "data_offset": 0, 00:10:40.169 "data_size": 65536 00:10:40.169 }, 00:10:40.169 { 00:10:40.169 "name": "BaseBdev2", 00:10:40.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:40.169 "is_configured": false, 00:10:40.169 "data_offset": 0, 00:10:40.169 "data_size": 0 00:10:40.169 } 00:10:40.169 ] 00:10:40.169 }' 00:10:40.169 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:40.169 11:51:30 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:40.735 11:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:40.993 [2024-07-12 11:51:31.010006] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:40.993 [2024-07-12 11:51:31.010035] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xee5890 00:10:40.993 [2024-07-12 11:51:31.010040] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:10:40.993 [2024-07-12 11:51:31.010172] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xee3c20 00:10:40.993 [2024-07-12 11:51:31.010257] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xee5890 00:10:40.993 [2024-07-12 11:51:31.010263] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xee5890 00:10:40.993 [2024-07-12 11:51:31.010383] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:40.993 BaseBdev2 00:10:40.993 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:40.993 11:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:40.993 11:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:40.993 11:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:40.993 11:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:40.993 11:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:40.993 11:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:40.993 11:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:41.251 [ 00:10:41.251 { 00:10:41.251 "name": "BaseBdev2", 00:10:41.251 "aliases": [ 00:10:41.251 "e01eb6fb-9075-45b1-8039-722811003c65" 00:10:41.251 ], 00:10:41.251 "product_name": "Malloc disk", 00:10:41.251 "block_size": 512, 00:10:41.251 "num_blocks": 65536, 00:10:41.251 "uuid": "e01eb6fb-9075-45b1-8039-722811003c65", 00:10:41.251 "assigned_rate_limits": { 00:10:41.251 "rw_ios_per_sec": 0, 00:10:41.251 "rw_mbytes_per_sec": 0, 00:10:41.251 "r_mbytes_per_sec": 0, 00:10:41.251 "w_mbytes_per_sec": 0 00:10:41.251 }, 00:10:41.251 "claimed": true, 00:10:41.251 "claim_type": "exclusive_write", 00:10:41.251 "zoned": false, 00:10:41.251 "supported_io_types": { 00:10:41.251 "read": true, 00:10:41.251 "write": true, 00:10:41.251 "unmap": true, 00:10:41.251 "flush": true, 00:10:41.251 "reset": true, 00:10:41.251 "nvme_admin": false, 00:10:41.251 "nvme_io": false, 00:10:41.251 "nvme_io_md": false, 00:10:41.251 "write_zeroes": true, 00:10:41.251 "zcopy": true, 00:10:41.251 "get_zone_info": false, 00:10:41.251 "zone_management": false, 00:10:41.251 "zone_append": false, 00:10:41.251 "compare": false, 00:10:41.251 "compare_and_write": false, 00:10:41.251 "abort": true, 00:10:41.251 "seek_hole": false, 00:10:41.251 "seek_data": false, 00:10:41.251 "copy": true, 00:10:41.251 "nvme_iov_md": false 00:10:41.251 }, 00:10:41.251 "memory_domains": [ 00:10:41.251 { 00:10:41.251 "dma_device_id": "system", 00:10:41.251 "dma_device_type": 1 00:10:41.251 }, 00:10:41.251 { 00:10:41.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:41.251 "dma_device_type": 2 00:10:41.251 } 00:10:41.251 ], 00:10:41.251 "driver_specific": {} 00:10:41.251 } 00:10:41.251 ] 
00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:41.251 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:41.509 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:41.509 "name": "Existed_Raid", 00:10:41.509 "uuid": "6ec9c348-195b-431d-9011-12de0dac50f1", 
00:10:41.509 "strip_size_kb": 0, 00:10:41.509 "state": "online", 00:10:41.509 "raid_level": "raid1", 00:10:41.509 "superblock": false, 00:10:41.509 "num_base_bdevs": 2, 00:10:41.509 "num_base_bdevs_discovered": 2, 00:10:41.509 "num_base_bdevs_operational": 2, 00:10:41.509 "base_bdevs_list": [ 00:10:41.509 { 00:10:41.509 "name": "BaseBdev1", 00:10:41.510 "uuid": "27085e38-1d09-4b6e-af30-8556ad738a8e", 00:10:41.510 "is_configured": true, 00:10:41.510 "data_offset": 0, 00:10:41.510 "data_size": 65536 00:10:41.510 }, 00:10:41.510 { 00:10:41.510 "name": "BaseBdev2", 00:10:41.510 "uuid": "e01eb6fb-9075-45b1-8039-722811003c65", 00:10:41.510 "is_configured": true, 00:10:41.510 "data_offset": 0, 00:10:41.510 "data_size": 65536 00:10:41.510 } 00:10:41.510 ] 00:10:41.510 }' 00:10:41.510 11:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:41.510 11:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:42.076 [2024-07-12 11:51:32.173178] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:42.076 "name": "Existed_Raid", 00:10:42.076 "aliases": [ 00:10:42.076 "6ec9c348-195b-431d-9011-12de0dac50f1" 00:10:42.076 ], 00:10:42.076 "product_name": "Raid Volume", 00:10:42.076 "block_size": 512, 00:10:42.076 "num_blocks": 65536, 00:10:42.076 "uuid": "6ec9c348-195b-431d-9011-12de0dac50f1", 00:10:42.076 "assigned_rate_limits": { 00:10:42.076 "rw_ios_per_sec": 0, 00:10:42.076 "rw_mbytes_per_sec": 0, 00:10:42.076 "r_mbytes_per_sec": 0, 00:10:42.076 "w_mbytes_per_sec": 0 00:10:42.076 }, 00:10:42.076 "claimed": false, 00:10:42.076 "zoned": false, 00:10:42.076 "supported_io_types": { 00:10:42.076 "read": true, 00:10:42.076 "write": true, 00:10:42.076 "unmap": false, 00:10:42.076 "flush": false, 00:10:42.076 "reset": true, 00:10:42.076 "nvme_admin": false, 00:10:42.076 "nvme_io": false, 00:10:42.076 "nvme_io_md": false, 00:10:42.076 "write_zeroes": true, 00:10:42.076 "zcopy": false, 00:10:42.076 "get_zone_info": false, 00:10:42.076 "zone_management": false, 00:10:42.076 "zone_append": false, 00:10:42.076 "compare": false, 00:10:42.076 "compare_and_write": false, 00:10:42.076 "abort": false, 00:10:42.076 "seek_hole": false, 00:10:42.076 "seek_data": false, 00:10:42.076 "copy": false, 00:10:42.076 "nvme_iov_md": false 00:10:42.076 }, 00:10:42.076 "memory_domains": [ 00:10:42.076 { 00:10:42.076 "dma_device_id": "system", 00:10:42.076 "dma_device_type": 1 00:10:42.076 }, 00:10:42.076 { 00:10:42.076 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:42.076 "dma_device_type": 2 00:10:42.076 }, 00:10:42.076 { 00:10:42.076 "dma_device_id": "system", 00:10:42.076 "dma_device_type": 1 00:10:42.076 }, 00:10:42.076 { 00:10:42.076 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:42.076 "dma_device_type": 2 00:10:42.076 } 00:10:42.076 ], 00:10:42.076 "driver_specific": { 00:10:42.076 "raid": { 
00:10:42.076 "uuid": "6ec9c348-195b-431d-9011-12de0dac50f1", 00:10:42.076 "strip_size_kb": 0, 00:10:42.076 "state": "online", 00:10:42.076 "raid_level": "raid1", 00:10:42.076 "superblock": false, 00:10:42.076 "num_base_bdevs": 2, 00:10:42.076 "num_base_bdevs_discovered": 2, 00:10:42.076 "num_base_bdevs_operational": 2, 00:10:42.076 "base_bdevs_list": [ 00:10:42.076 { 00:10:42.076 "name": "BaseBdev1", 00:10:42.076 "uuid": "27085e38-1d09-4b6e-af30-8556ad738a8e", 00:10:42.076 "is_configured": true, 00:10:42.076 "data_offset": 0, 00:10:42.076 "data_size": 65536 00:10:42.076 }, 00:10:42.076 { 00:10:42.076 "name": "BaseBdev2", 00:10:42.076 "uuid": "e01eb6fb-9075-45b1-8039-722811003c65", 00:10:42.076 "is_configured": true, 00:10:42.076 "data_offset": 0, 00:10:42.076 "data_size": 65536 00:10:42.076 } 00:10:42.076 ] 00:10:42.076 } 00:10:42.076 } 00:10:42.076 }' 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:42.076 BaseBdev2' 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:42.076 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:42.333 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:42.333 "name": "BaseBdev1", 00:10:42.333 "aliases": [ 00:10:42.333 "27085e38-1d09-4b6e-af30-8556ad738a8e" 00:10:42.333 ], 00:10:42.333 "product_name": "Malloc disk", 00:10:42.333 "block_size": 512, 00:10:42.333 "num_blocks": 65536, 00:10:42.333 "uuid": "27085e38-1d09-4b6e-af30-8556ad738a8e", 
00:10:42.333 "assigned_rate_limits": { 00:10:42.333 "rw_ios_per_sec": 0, 00:10:42.333 "rw_mbytes_per_sec": 0, 00:10:42.333 "r_mbytes_per_sec": 0, 00:10:42.333 "w_mbytes_per_sec": 0 00:10:42.333 }, 00:10:42.333 "claimed": true, 00:10:42.333 "claim_type": "exclusive_write", 00:10:42.333 "zoned": false, 00:10:42.333 "supported_io_types": { 00:10:42.333 "read": true, 00:10:42.333 "write": true, 00:10:42.333 "unmap": true, 00:10:42.333 "flush": true, 00:10:42.333 "reset": true, 00:10:42.333 "nvme_admin": false, 00:10:42.333 "nvme_io": false, 00:10:42.333 "nvme_io_md": false, 00:10:42.333 "write_zeroes": true, 00:10:42.333 "zcopy": true, 00:10:42.333 "get_zone_info": false, 00:10:42.333 "zone_management": false, 00:10:42.333 "zone_append": false, 00:10:42.333 "compare": false, 00:10:42.333 "compare_and_write": false, 00:10:42.333 "abort": true, 00:10:42.333 "seek_hole": false, 00:10:42.333 "seek_data": false, 00:10:42.333 "copy": true, 00:10:42.333 "nvme_iov_md": false 00:10:42.333 }, 00:10:42.333 "memory_domains": [ 00:10:42.333 { 00:10:42.333 "dma_device_id": "system", 00:10:42.334 "dma_device_type": 1 00:10:42.334 }, 00:10:42.334 { 00:10:42.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:42.334 "dma_device_type": 2 00:10:42.334 } 00:10:42.334 ], 00:10:42.334 "driver_specific": {} 00:10:42.334 }' 00:10:42.334 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:42.334 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:42.334 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:42.334 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.334 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.334 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:42.334 11:51:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.592 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.592 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:42.592 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:42.592 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:42.592 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:42.592 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:42.592 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:42.592 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:42.850 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:42.850 "name": "BaseBdev2", 00:10:42.850 "aliases": [ 00:10:42.850 "e01eb6fb-9075-45b1-8039-722811003c65" 00:10:42.850 ], 00:10:42.850 "product_name": "Malloc disk", 00:10:42.850 "block_size": 512, 00:10:42.850 "num_blocks": 65536, 00:10:42.850 "uuid": "e01eb6fb-9075-45b1-8039-722811003c65", 00:10:42.850 "assigned_rate_limits": { 00:10:42.850 "rw_ios_per_sec": 0, 00:10:42.850 "rw_mbytes_per_sec": 0, 00:10:42.850 "r_mbytes_per_sec": 0, 00:10:42.850 "w_mbytes_per_sec": 0 00:10:42.850 }, 00:10:42.850 "claimed": true, 00:10:42.850 "claim_type": "exclusive_write", 00:10:42.850 "zoned": false, 00:10:42.850 "supported_io_types": { 00:10:42.850 "read": true, 00:10:42.850 "write": true, 00:10:42.850 "unmap": true, 00:10:42.850 "flush": true, 00:10:42.850 "reset": true, 00:10:42.850 "nvme_admin": false, 00:10:42.850 "nvme_io": false, 00:10:42.850 "nvme_io_md": false, 00:10:42.850 "write_zeroes": true, 
00:10:42.850 "zcopy": true, 00:10:42.850 "get_zone_info": false, 00:10:42.850 "zone_management": false, 00:10:42.850 "zone_append": false, 00:10:42.850 "compare": false, 00:10:42.850 "compare_and_write": false, 00:10:42.850 "abort": true, 00:10:42.850 "seek_hole": false, 00:10:42.850 "seek_data": false, 00:10:42.850 "copy": true, 00:10:42.850 "nvme_iov_md": false 00:10:42.850 }, 00:10:42.850 "memory_domains": [ 00:10:42.850 { 00:10:42.850 "dma_device_id": "system", 00:10:42.850 "dma_device_type": 1 00:10:42.850 }, 00:10:42.850 { 00:10:42.850 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:42.850 "dma_device_type": 2 00:10:42.850 } 00:10:42.850 ], 00:10:42.850 "driver_specific": {} 00:10:42.850 }' 00:10:42.850 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:42.850 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:42.850 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:42.850 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.850 11:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.850 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:42.850 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.850 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.850 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:42.850 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:43.108 [2024-07-12 11:51:33.324022] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.108 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:43.367 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:43.367 "name": "Existed_Raid", 00:10:43.367 "uuid": "6ec9c348-195b-431d-9011-12de0dac50f1", 00:10:43.367 "strip_size_kb": 0, 00:10:43.367 "state": "online", 00:10:43.367 "raid_level": "raid1", 00:10:43.367 "superblock": false, 00:10:43.367 "num_base_bdevs": 2, 00:10:43.367 "num_base_bdevs_discovered": 1, 00:10:43.367 "num_base_bdevs_operational": 1, 00:10:43.367 "base_bdevs_list": [ 00:10:43.367 { 00:10:43.367 "name": null, 00:10:43.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:43.367 "is_configured": false, 00:10:43.367 "data_offset": 0, 00:10:43.367 "data_size": 65536 00:10:43.367 }, 00:10:43.367 { 00:10:43.367 "name": "BaseBdev2", 00:10:43.367 "uuid": "e01eb6fb-9075-45b1-8039-722811003c65", 00:10:43.367 "is_configured": true, 00:10:43.367 "data_offset": 0, 00:10:43.367 "data_size": 65536 00:10:43.367 } 00:10:43.367 ] 00:10:43.367 }' 00:10:43.367 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:43.367 11:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:43.935 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:43.935 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:43.935 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.935 11:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:43.935 11:51:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:43.935 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:43.935 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:44.194 [2024-07-12 11:51:34.303432] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:44.194 [2024-07-12 11:51:34.303494] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:44.194 [2024-07-12 11:51:34.313532] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:44.194 [2024-07-12 11:51:34.313573] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:44.194 [2024-07-12 11:51:34.313580] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xee5890 name Existed_Raid, state offline 00:10:44.194 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:44.194 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:44.194 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:44.194 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- 
# killprocess 603361 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 603361 ']' 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 603361 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 603361 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 603361' 00:10:44.453 killing process with pid 603361 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 603361 00:10:44.453 [2024-07-12 11:51:34.539722] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:44.453 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 603361 00:10:44.453 [2024-07-12 11:51:34.540497] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:44.712 11:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:44.712 00:10:44.712 real 0m7.978s 00:10:44.712 user 0m14.271s 00:10:44.712 sys 0m1.311s 00:10:44.712 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:44.712 11:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:44.712 ************************************ 00:10:44.712 END TEST raid_state_function_test 00:10:44.712 ************************************ 00:10:44.712 
11:51:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:44.712 11:51:34 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:10:44.712 11:51:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:44.713 11:51:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:44.713 11:51:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:44.713 ************************************ 00:10:44.713 START TEST raid_state_function_test_sb 00:10:44.713 ************************************ 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=604952 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 604952' 00:10:44.713 Process raid pid: 604952 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 604952 /var/tmp/spdk-raid.sock 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 604952 ']' 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:44.713 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:44.713 11:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:44.713 [2024-07-12 11:51:34.837421] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:10:44.713 [2024-07-12 11:51:34.837456] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:44.713 [2024-07-12 11:51:34.901604] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:44.972 [2024-07-12 11:51:34.979843] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.972 [2024-07-12 11:51:35.030356] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:44.972 [2024-07-12 11:51:35.030388] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:45.539 11:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:45.539 11:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:45.539 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 
00:10:45.540 [2024-07-12 11:51:35.777079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:45.540 [2024-07-12 11:51:35.777108] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:45.540 [2024-07-12 11:51:35.777113] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:45.540 [2024-07-12 11:51:35.777118] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:45.799 "name": "Existed_Raid", 00:10:45.799 "uuid": "d1c814d1-1f70-4085-87f9-79e8b0d9c8f1", 00:10:45.799 "strip_size_kb": 0, 00:10:45.799 "state": "configuring", 00:10:45.799 "raid_level": "raid1", 00:10:45.799 "superblock": true, 00:10:45.799 "num_base_bdevs": 2, 00:10:45.799 "num_base_bdevs_discovered": 0, 00:10:45.799 "num_base_bdevs_operational": 2, 00:10:45.799 "base_bdevs_list": [ 00:10:45.799 { 00:10:45.799 "name": "BaseBdev1", 00:10:45.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:45.799 "is_configured": false, 00:10:45.799 "data_offset": 0, 00:10:45.799 "data_size": 0 00:10:45.799 }, 00:10:45.799 { 00:10:45.799 "name": "BaseBdev2", 00:10:45.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:45.799 "is_configured": false, 00:10:45.799 "data_offset": 0, 00:10:45.799 "data_size": 0 00:10:45.799 } 00:10:45.799 ] 00:10:45.799 }' 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:45.799 11:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:46.426 11:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:46.426 [2024-07-12 11:51:36.571032] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:46.426 [2024-07-12 11:51:36.571051] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22e51b0 name Existed_Raid, state configuring 00:10:46.426 11:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:46.695 [2024-07-12 
11:51:36.743497] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:46.695 [2024-07-12 11:51:36.743516] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:46.695 [2024-07-12 11:51:36.743525] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:46.695 [2024-07-12 11:51:36.743530] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:46.695 11:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:46.696 [2024-07-12 11:51:36.928321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:46.696 BaseBdev1 00:10:46.970 11:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:46.970 11:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:46.970 11:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:46.970 11:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:46.970 11:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:46.970 11:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:46.970 11:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:46.970 11:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:47.278 
[ 00:10:47.278 { 00:10:47.278 "name": "BaseBdev1", 00:10:47.278 "aliases": [ 00:10:47.278 "7dc1ec24-0a5e-40d1-9ef4-e919476f976a" 00:10:47.278 ], 00:10:47.278 "product_name": "Malloc disk", 00:10:47.278 "block_size": 512, 00:10:47.278 "num_blocks": 65536, 00:10:47.278 "uuid": "7dc1ec24-0a5e-40d1-9ef4-e919476f976a", 00:10:47.278 "assigned_rate_limits": { 00:10:47.278 "rw_ios_per_sec": 0, 00:10:47.278 "rw_mbytes_per_sec": 0, 00:10:47.278 "r_mbytes_per_sec": 0, 00:10:47.278 "w_mbytes_per_sec": 0 00:10:47.278 }, 00:10:47.278 "claimed": true, 00:10:47.278 "claim_type": "exclusive_write", 00:10:47.278 "zoned": false, 00:10:47.278 "supported_io_types": { 00:10:47.278 "read": true, 00:10:47.278 "write": true, 00:10:47.278 "unmap": true, 00:10:47.278 "flush": true, 00:10:47.278 "reset": true, 00:10:47.278 "nvme_admin": false, 00:10:47.278 "nvme_io": false, 00:10:47.278 "nvme_io_md": false, 00:10:47.278 "write_zeroes": true, 00:10:47.278 "zcopy": true, 00:10:47.278 "get_zone_info": false, 00:10:47.278 "zone_management": false, 00:10:47.278 "zone_append": false, 00:10:47.278 "compare": false, 00:10:47.278 "compare_and_write": false, 00:10:47.279 "abort": true, 00:10:47.279 "seek_hole": false, 00:10:47.279 "seek_data": false, 00:10:47.279 "copy": true, 00:10:47.279 "nvme_iov_md": false 00:10:47.279 }, 00:10:47.279 "memory_domains": [ 00:10:47.279 { 00:10:47.279 "dma_device_id": "system", 00:10:47.279 "dma_device_type": 1 00:10:47.279 }, 00:10:47.279 { 00:10:47.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:47.279 "dma_device_type": 2 00:10:47.279 } 00:10:47.279 ], 00:10:47.279 "driver_specific": {} 00:10:47.279 } 00:10:47.279 ] 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:47.279 "name": "Existed_Raid", 00:10:47.279 "uuid": "7d59229f-057b-4ff5-ae8b-363d9aa322c0", 00:10:47.279 "strip_size_kb": 0, 00:10:47.279 "state": "configuring", 00:10:47.279 "raid_level": "raid1", 00:10:47.279 "superblock": true, 00:10:47.279 "num_base_bdevs": 2, 00:10:47.279 "num_base_bdevs_discovered": 1, 00:10:47.279 "num_base_bdevs_operational": 2, 00:10:47.279 "base_bdevs_list": [ 00:10:47.279 { 00:10:47.279 "name": "BaseBdev1", 00:10:47.279 "uuid": "7dc1ec24-0a5e-40d1-9ef4-e919476f976a", 00:10:47.279 "is_configured": true, 00:10:47.279 "data_offset": 2048, 00:10:47.279 "data_size": 63488 
00:10:47.279 }, 00:10:47.279 { 00:10:47.279 "name": "BaseBdev2", 00:10:47.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.279 "is_configured": false, 00:10:47.279 "data_offset": 0, 00:10:47.279 "data_size": 0 00:10:47.279 } 00:10:47.279 ] 00:10:47.279 }' 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:47.279 11:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:47.873 11:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:47.873 [2024-07-12 11:51:38.107370] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:47.873 [2024-07-12 11:51:38.107397] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22e4aa0 name Existed_Raid, state configuring 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:48.132 [2024-07-12 11:51:38.283852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:48.132 [2024-07-12 11:51:38.284883] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:48.132 [2024-07-12 11:51:38.284905] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:48.132 11:51:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.132 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:48.392 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:48.392 "name": "Existed_Raid", 00:10:48.392 "uuid": "1d9ce780-ac20-4a11-b36e-2907e9c7b4d3", 00:10:48.392 "strip_size_kb": 0, 00:10:48.392 "state": "configuring", 00:10:48.392 "raid_level": "raid1", 00:10:48.392 "superblock": true, 00:10:48.392 "num_base_bdevs": 2, 00:10:48.392 "num_base_bdevs_discovered": 1, 00:10:48.392 "num_base_bdevs_operational": 2, 00:10:48.392 "base_bdevs_list": [ 00:10:48.392 { 00:10:48.392 "name": "BaseBdev1", 00:10:48.392 "uuid": "7dc1ec24-0a5e-40d1-9ef4-e919476f976a", 00:10:48.392 "is_configured": 
true, 00:10:48.392 "data_offset": 2048, 00:10:48.392 "data_size": 63488 00:10:48.392 }, 00:10:48.392 { 00:10:48.392 "name": "BaseBdev2", 00:10:48.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:48.392 "is_configured": false, 00:10:48.392 "data_offset": 0, 00:10:48.392 "data_size": 0 00:10:48.392 } 00:10:48.392 ] 00:10:48.392 }' 00:10:48.392 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:48.392 11:51:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:48.958 11:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:48.958 [2024-07-12 11:51:39.112636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:48.958 [2024-07-12 11:51:39.112741] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22e5890 00:10:48.959 [2024-07-12 11:51:39.112749] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:48.959 [2024-07-12 11:51:39.112859] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22e3c20 00:10:48.959 [2024-07-12 11:51:39.112942] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22e5890 00:10:48.959 [2024-07-12 11:51:39.112951] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22e5890 00:10:48.959 [2024-07-12 11:51:39.113009] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:48.959 BaseBdev2 00:10:48.959 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:48.959 11:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:48.959 11:51:39 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:48.959 11:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:48.959 11:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:48.959 11:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:48.959 11:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:49.217 11:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:49.217 [ 00:10:49.217 { 00:10:49.217 "name": "BaseBdev2", 00:10:49.217 "aliases": [ 00:10:49.217 "8236d22b-b63d-4f5e-8945-3e2984e4c7b6" 00:10:49.217 ], 00:10:49.217 "product_name": "Malloc disk", 00:10:49.217 "block_size": 512, 00:10:49.217 "num_blocks": 65536, 00:10:49.217 "uuid": "8236d22b-b63d-4f5e-8945-3e2984e4c7b6", 00:10:49.217 "assigned_rate_limits": { 00:10:49.217 "rw_ios_per_sec": 0, 00:10:49.217 "rw_mbytes_per_sec": 0, 00:10:49.217 "r_mbytes_per_sec": 0, 00:10:49.217 "w_mbytes_per_sec": 0 00:10:49.217 }, 00:10:49.217 "claimed": true, 00:10:49.217 "claim_type": "exclusive_write", 00:10:49.217 "zoned": false, 00:10:49.217 "supported_io_types": { 00:10:49.217 "read": true, 00:10:49.217 "write": true, 00:10:49.217 "unmap": true, 00:10:49.217 "flush": true, 00:10:49.217 "reset": true, 00:10:49.217 "nvme_admin": false, 00:10:49.217 "nvme_io": false, 00:10:49.217 "nvme_io_md": false, 00:10:49.217 "write_zeroes": true, 00:10:49.217 "zcopy": true, 00:10:49.217 "get_zone_info": false, 00:10:49.217 "zone_management": false, 00:10:49.217 "zone_append": false, 00:10:49.217 "compare": false, 00:10:49.217 "compare_and_write": false, 00:10:49.217 "abort": 
true, 00:10:49.217 "seek_hole": false, 00:10:49.217 "seek_data": false, 00:10:49.217 "copy": true, 00:10:49.217 "nvme_iov_md": false 00:10:49.217 }, 00:10:49.217 "memory_domains": [ 00:10:49.217 { 00:10:49.217 "dma_device_id": "system", 00:10:49.217 "dma_device_type": 1 00:10:49.217 }, 00:10:49.217 { 00:10:49.217 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.217 "dma_device_type": 2 00:10:49.217 } 00:10:49.217 ], 00:10:49.217 "driver_specific": {} 00:10:49.217 } 00:10:49.217 ] 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:49.476 
11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:49.476 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:49.476 "name": "Existed_Raid", 00:10:49.476 "uuid": "1d9ce780-ac20-4a11-b36e-2907e9c7b4d3", 00:10:49.476 "strip_size_kb": 0, 00:10:49.476 "state": "online", 00:10:49.476 "raid_level": "raid1", 00:10:49.476 "superblock": true, 00:10:49.476 "num_base_bdevs": 2, 00:10:49.476 "num_base_bdevs_discovered": 2, 00:10:49.476 "num_base_bdevs_operational": 2, 00:10:49.476 "base_bdevs_list": [ 00:10:49.476 { 00:10:49.476 "name": "BaseBdev1", 00:10:49.476 "uuid": "7dc1ec24-0a5e-40d1-9ef4-e919476f976a", 00:10:49.476 "is_configured": true, 00:10:49.476 "data_offset": 2048, 00:10:49.476 "data_size": 63488 00:10:49.476 }, 00:10:49.476 { 00:10:49.477 "name": "BaseBdev2", 00:10:49.477 "uuid": "8236d22b-b63d-4f5e-8945-3e2984e4c7b6", 00:10:49.477 "is_configured": true, 00:10:49.477 "data_offset": 2048, 00:10:49.477 "data_size": 63488 00:10:49.477 } 00:10:49.477 ] 00:10:49.477 }' 00:10:49.477 11:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:49.477 11:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:50.044 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:50.044 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:50.044 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:50.044 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:10:50.044 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:50.044 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:50.044 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:50.044 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:50.304 [2024-07-12 11:51:40.295886] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:50.304 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:50.304 "name": "Existed_Raid", 00:10:50.304 "aliases": [ 00:10:50.304 "1d9ce780-ac20-4a11-b36e-2907e9c7b4d3" 00:10:50.304 ], 00:10:50.304 "product_name": "Raid Volume", 00:10:50.304 "block_size": 512, 00:10:50.304 "num_blocks": 63488, 00:10:50.304 "uuid": "1d9ce780-ac20-4a11-b36e-2907e9c7b4d3", 00:10:50.304 "assigned_rate_limits": { 00:10:50.304 "rw_ios_per_sec": 0, 00:10:50.304 "rw_mbytes_per_sec": 0, 00:10:50.304 "r_mbytes_per_sec": 0, 00:10:50.304 "w_mbytes_per_sec": 0 00:10:50.304 }, 00:10:50.304 "claimed": false, 00:10:50.304 "zoned": false, 00:10:50.304 "supported_io_types": { 00:10:50.304 "read": true, 00:10:50.304 "write": true, 00:10:50.304 "unmap": false, 00:10:50.304 "flush": false, 00:10:50.304 "reset": true, 00:10:50.304 "nvme_admin": false, 00:10:50.304 "nvme_io": false, 00:10:50.304 "nvme_io_md": false, 00:10:50.304 "write_zeroes": true, 00:10:50.304 "zcopy": false, 00:10:50.304 "get_zone_info": false, 00:10:50.304 "zone_management": false, 00:10:50.304 "zone_append": false, 00:10:50.304 "compare": false, 00:10:50.304 "compare_and_write": false, 00:10:50.304 "abort": false, 00:10:50.304 "seek_hole": false, 00:10:50.304 "seek_data": false, 00:10:50.304 "copy": false, 00:10:50.304 "nvme_iov_md": false 
00:10:50.304 }, 00:10:50.304 "memory_domains": [ 00:10:50.304 { 00:10:50.304 "dma_device_id": "system", 00:10:50.304 "dma_device_type": 1 00:10:50.304 }, 00:10:50.304 { 00:10:50.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.304 "dma_device_type": 2 00:10:50.304 }, 00:10:50.304 { 00:10:50.304 "dma_device_id": "system", 00:10:50.304 "dma_device_type": 1 00:10:50.304 }, 00:10:50.304 { 00:10:50.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.304 "dma_device_type": 2 00:10:50.304 } 00:10:50.304 ], 00:10:50.304 "driver_specific": { 00:10:50.304 "raid": { 00:10:50.304 "uuid": "1d9ce780-ac20-4a11-b36e-2907e9c7b4d3", 00:10:50.304 "strip_size_kb": 0, 00:10:50.304 "state": "online", 00:10:50.304 "raid_level": "raid1", 00:10:50.304 "superblock": true, 00:10:50.304 "num_base_bdevs": 2, 00:10:50.304 "num_base_bdevs_discovered": 2, 00:10:50.304 "num_base_bdevs_operational": 2, 00:10:50.304 "base_bdevs_list": [ 00:10:50.304 { 00:10:50.304 "name": "BaseBdev1", 00:10:50.304 "uuid": "7dc1ec24-0a5e-40d1-9ef4-e919476f976a", 00:10:50.304 "is_configured": true, 00:10:50.304 "data_offset": 2048, 00:10:50.304 "data_size": 63488 00:10:50.304 }, 00:10:50.304 { 00:10:50.304 "name": "BaseBdev2", 00:10:50.304 "uuid": "8236d22b-b63d-4f5e-8945-3e2984e4c7b6", 00:10:50.304 "is_configured": true, 00:10:50.304 "data_offset": 2048, 00:10:50.304 "data_size": 63488 00:10:50.304 } 00:10:50.304 ] 00:10:50.304 } 00:10:50.304 } 00:10:50.304 }' 00:10:50.304 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:50.304 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:50.304 BaseBdev2' 00:10:50.304 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:50.304 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:50.304 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:50.304 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:50.304 "name": "BaseBdev1", 00:10:50.304 "aliases": [ 00:10:50.304 "7dc1ec24-0a5e-40d1-9ef4-e919476f976a" 00:10:50.304 ], 00:10:50.304 "product_name": "Malloc disk", 00:10:50.304 "block_size": 512, 00:10:50.304 "num_blocks": 65536, 00:10:50.304 "uuid": "7dc1ec24-0a5e-40d1-9ef4-e919476f976a", 00:10:50.304 "assigned_rate_limits": { 00:10:50.304 "rw_ios_per_sec": 0, 00:10:50.304 "rw_mbytes_per_sec": 0, 00:10:50.304 "r_mbytes_per_sec": 0, 00:10:50.304 "w_mbytes_per_sec": 0 00:10:50.304 }, 00:10:50.304 "claimed": true, 00:10:50.304 "claim_type": "exclusive_write", 00:10:50.304 "zoned": false, 00:10:50.304 "supported_io_types": { 00:10:50.304 "read": true, 00:10:50.304 "write": true, 00:10:50.304 "unmap": true, 00:10:50.304 "flush": true, 00:10:50.304 "reset": true, 00:10:50.304 "nvme_admin": false, 00:10:50.304 "nvme_io": false, 00:10:50.304 "nvme_io_md": false, 00:10:50.304 "write_zeroes": true, 00:10:50.304 "zcopy": true, 00:10:50.304 "get_zone_info": false, 00:10:50.304 "zone_management": false, 00:10:50.304 "zone_append": false, 00:10:50.304 "compare": false, 00:10:50.304 "compare_and_write": false, 00:10:50.304 "abort": true, 00:10:50.304 "seek_hole": false, 00:10:50.304 "seek_data": false, 00:10:50.304 "copy": true, 00:10:50.304 "nvme_iov_md": false 00:10:50.304 }, 00:10:50.304 "memory_domains": [ 00:10:50.304 { 00:10:50.304 "dma_device_id": "system", 00:10:50.304 "dma_device_type": 1 00:10:50.304 }, 00:10:50.304 { 00:10:50.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.304 "dma_device_type": 2 00:10:50.304 } 00:10:50.304 ], 00:10:50.304 "driver_specific": {} 00:10:50.304 }' 00:10:50.304 11:51:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.563 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.563 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:50.563 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.563 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.563 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:50.563 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.563 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.563 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:50.563 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.822 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.822 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:50.822 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:50.822 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:50.822 11:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:50.822 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:50.822 "name": "BaseBdev2", 00:10:50.822 "aliases": [ 00:10:50.822 "8236d22b-b63d-4f5e-8945-3e2984e4c7b6" 00:10:50.822 ], 00:10:50.822 "product_name": "Malloc disk", 00:10:50.822 "block_size": 512, 00:10:50.822 
"num_blocks": 65536, 00:10:50.822 "uuid": "8236d22b-b63d-4f5e-8945-3e2984e4c7b6", 00:10:50.822 "assigned_rate_limits": { 00:10:50.822 "rw_ios_per_sec": 0, 00:10:50.822 "rw_mbytes_per_sec": 0, 00:10:50.822 "r_mbytes_per_sec": 0, 00:10:50.822 "w_mbytes_per_sec": 0 00:10:50.822 }, 00:10:50.822 "claimed": true, 00:10:50.822 "claim_type": "exclusive_write", 00:10:50.822 "zoned": false, 00:10:50.822 "supported_io_types": { 00:10:50.822 "read": true, 00:10:50.822 "write": true, 00:10:50.822 "unmap": true, 00:10:50.822 "flush": true, 00:10:50.822 "reset": true, 00:10:50.822 "nvme_admin": false, 00:10:50.822 "nvme_io": false, 00:10:50.822 "nvme_io_md": false, 00:10:50.822 "write_zeroes": true, 00:10:50.822 "zcopy": true, 00:10:50.822 "get_zone_info": false, 00:10:50.822 "zone_management": false, 00:10:50.822 "zone_append": false, 00:10:50.822 "compare": false, 00:10:50.822 "compare_and_write": false, 00:10:50.822 "abort": true, 00:10:50.822 "seek_hole": false, 00:10:50.822 "seek_data": false, 00:10:50.822 "copy": true, 00:10:50.823 "nvme_iov_md": false 00:10:50.823 }, 00:10:50.823 "memory_domains": [ 00:10:50.823 { 00:10:50.823 "dma_device_id": "system", 00:10:50.823 "dma_device_type": 1 00:10:50.823 }, 00:10:50.823 { 00:10:50.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.823 "dma_device_type": 2 00:10:50.823 } 00:10:50.823 ], 00:10:50.823 "driver_specific": {} 00:10:50.823 }' 00:10:50.823 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.823 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:51.082 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:51.082 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:51.082 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:51.082 11:51:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:51.082 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:51.082 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:51.082 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:51.082 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:51.082 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:51.340 [2024-07-12 11:51:41.498875] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:51.340 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:51.599 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:51.599 "name": "Existed_Raid", 00:10:51.599 "uuid": "1d9ce780-ac20-4a11-b36e-2907e9c7b4d3", 00:10:51.599 "strip_size_kb": 0, 00:10:51.599 "state": "online", 00:10:51.599 "raid_level": "raid1", 00:10:51.599 "superblock": true, 00:10:51.599 "num_base_bdevs": 2, 00:10:51.599 "num_base_bdevs_discovered": 1, 00:10:51.599 "num_base_bdevs_operational": 1, 00:10:51.599 "base_bdevs_list": [ 00:10:51.599 { 00:10:51.599 "name": null, 00:10:51.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:51.599 "is_configured": false, 00:10:51.599 "data_offset": 2048, 00:10:51.599 "data_size": 63488 00:10:51.599 }, 00:10:51.599 { 00:10:51.599 "name": "BaseBdev2", 00:10:51.599 "uuid": "8236d22b-b63d-4f5e-8945-3e2984e4c7b6", 00:10:51.599 "is_configured": true, 00:10:51.599 "data_offset": 2048, 00:10:51.599 "data_size": 
63488 00:10:51.599 } 00:10:51.599 ] 00:10:51.599 }' 00:10:51.599 11:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:51.599 11:51:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:52.167 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:52.167 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:52.167 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.167 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:52.167 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:52.167 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:52.167 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:52.426 [2024-07-12 11:51:42.510287] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:52.427 [2024-07-12 11:51:42.510350] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:52.427 [2024-07-12 11:51:42.520439] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:52.427 [2024-07-12 11:51:42.520481] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:52.427 [2024-07-12 11:51:42.520487] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22e5890 name Existed_Raid, state offline 00:10:52.427 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( 
i++ )) 00:10:52.427 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:52.427 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.427 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 604952 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 604952 ']' 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 604952 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 604952 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 604952' 00:10:52.686 killing process with pid 604952 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 604952 
00:10:52.686 [2024-07-12 11:51:42.743489] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 604952 00:10:52.686 [2024-07-12 11:51:42.744260] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:52.686 00:10:52.686 real 0m8.136s 00:10:52.686 user 0m14.575s 00:10:52.686 sys 0m1.350s 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:52.686 11:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:52.686 ************************************ 00:10:52.686 END TEST raid_state_function_test_sb 00:10:52.686 ************************************ 00:10:52.957 11:51:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:52.957 11:51:42 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:10:52.957 11:51:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:52.957 11:51:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:52.957 11:51:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:52.957 ************************************ 00:10:52.957 START TEST raid_superblock_test 00:10:52.957 ************************************ 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # 
local base_bdevs_malloc 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=606557 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 606557 /var/tmp/spdk-raid.sock 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 606557 ']' 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:52.957 11:51:42 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:52.957 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:52.957 11:51:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.957 [2024-07-12 11:51:43.036962] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:10:52.957 [2024-07-12 11:51:43.036997] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid606557 ] 00:10:52.957 [2024-07-12 11:51:43.101431] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:52.957 [2024-07-12 11:51:43.172515] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:53.218 [2024-07-12 11:51:43.230508] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:53.218 [2024-07-12 11:51:43.230540] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:53.786 11:51:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:53.786 11:51:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:53.786 11:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:53.787 11:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:53.787 11:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:53.787 11:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:53.787 11:51:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:53.787 11:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:53.787 11:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:53.787 11:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:53.787 11:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:53.787 malloc1 00:10:53.787 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:54.046 [2024-07-12 11:51:44.166384] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:54.046 [2024-07-12 11:51:44.166417] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:54.046 [2024-07-12 11:51:44.166428] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1109270 00:10:54.046 [2024-07-12 11:51:44.166434] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:54.046 [2024-07-12 11:51:44.167529] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:54.046 [2024-07-12 11:51:44.167549] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:54.046 pt1 00:10:54.046 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:54.046 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:54.046 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:54.046 11:51:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:54.046 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:54.046 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:54.046 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:54.046 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:54.046 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:54.305 malloc2 00:10:54.305 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:54.305 [2024-07-12 11:51:44.502892] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:54.305 [2024-07-12 11:51:44.502924] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:54.305 [2024-07-12 11:51:44.502932] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110a580 00:10:54.305 [2024-07-12 11:51:44.502957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:54.305 [2024-07-12 11:51:44.503959] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:54.305 [2024-07-12 11:51:44.503978] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:54.305 pt2 00:10:54.305 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:54.305 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:54.305 
11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:54.564 [2024-07-12 11:51:44.675342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:54.564 [2024-07-12 11:51:44.676128] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:54.564 [2024-07-12 11:51:44.676220] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12b4890 00:10:54.564 [2024-07-12 11:51:44.676227] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:54.564 [2024-07-12 11:51:44.676339] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1120160 00:10:54.564 [2024-07-12 11:51:44.676433] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12b4890 00:10:54.564 [2024-07-12 11:51:44.676437] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12b4890 00:10:54.564 [2024-07-12 11:51:44.676494] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.564 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:54.823 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:54.823 "name": "raid_bdev1", 00:10:54.823 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:10:54.823 "strip_size_kb": 0, 00:10:54.823 "state": "online", 00:10:54.823 "raid_level": "raid1", 00:10:54.823 "superblock": true, 00:10:54.823 "num_base_bdevs": 2, 00:10:54.823 "num_base_bdevs_discovered": 2, 00:10:54.823 "num_base_bdevs_operational": 2, 00:10:54.823 "base_bdevs_list": [ 00:10:54.823 { 00:10:54.823 "name": "pt1", 00:10:54.823 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:54.823 "is_configured": true, 00:10:54.823 "data_offset": 2048, 00:10:54.823 "data_size": 63488 00:10:54.823 }, 00:10:54.823 { 00:10:54.823 "name": "pt2", 00:10:54.823 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:54.823 "is_configured": true, 00:10:54.823 "data_offset": 2048, 00:10:54.823 "data_size": 63488 00:10:54.823 } 00:10:54.823 ] 00:10:54.823 }' 00:10:54.823 11:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:54.823 11:51:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:55.389 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:55.389 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # 
local raid_bdev_name=raid_bdev1 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:55.390 [2024-07-12 11:51:45.505645] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:55.390 "name": "raid_bdev1", 00:10:55.390 "aliases": [ 00:10:55.390 "9f30ec17-b986-49dc-a975-193a98a059a2" 00:10:55.390 ], 00:10:55.390 "product_name": "Raid Volume", 00:10:55.390 "block_size": 512, 00:10:55.390 "num_blocks": 63488, 00:10:55.390 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:10:55.390 "assigned_rate_limits": { 00:10:55.390 "rw_ios_per_sec": 0, 00:10:55.390 "rw_mbytes_per_sec": 0, 00:10:55.390 "r_mbytes_per_sec": 0, 00:10:55.390 "w_mbytes_per_sec": 0 00:10:55.390 }, 00:10:55.390 "claimed": false, 00:10:55.390 "zoned": false, 00:10:55.390 "supported_io_types": { 00:10:55.390 "read": true, 00:10:55.390 "write": true, 00:10:55.390 "unmap": false, 00:10:55.390 "flush": false, 00:10:55.390 "reset": true, 00:10:55.390 "nvme_admin": false, 00:10:55.390 "nvme_io": false, 00:10:55.390 "nvme_io_md": false, 00:10:55.390 "write_zeroes": true, 00:10:55.390 "zcopy": false, 00:10:55.390 "get_zone_info": false, 00:10:55.390 "zone_management": false, 00:10:55.390 "zone_append": false, 00:10:55.390 "compare": false, 
00:10:55.390 "compare_and_write": false, 00:10:55.390 "abort": false, 00:10:55.390 "seek_hole": false, 00:10:55.390 "seek_data": false, 00:10:55.390 "copy": false, 00:10:55.390 "nvme_iov_md": false 00:10:55.390 }, 00:10:55.390 "memory_domains": [ 00:10:55.390 { 00:10:55.390 "dma_device_id": "system", 00:10:55.390 "dma_device_type": 1 00:10:55.390 }, 00:10:55.390 { 00:10:55.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.390 "dma_device_type": 2 00:10:55.390 }, 00:10:55.390 { 00:10:55.390 "dma_device_id": "system", 00:10:55.390 "dma_device_type": 1 00:10:55.390 }, 00:10:55.390 { 00:10:55.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.390 "dma_device_type": 2 00:10:55.390 } 00:10:55.390 ], 00:10:55.390 "driver_specific": { 00:10:55.390 "raid": { 00:10:55.390 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:10:55.390 "strip_size_kb": 0, 00:10:55.390 "state": "online", 00:10:55.390 "raid_level": "raid1", 00:10:55.390 "superblock": true, 00:10:55.390 "num_base_bdevs": 2, 00:10:55.390 "num_base_bdevs_discovered": 2, 00:10:55.390 "num_base_bdevs_operational": 2, 00:10:55.390 "base_bdevs_list": [ 00:10:55.390 { 00:10:55.390 "name": "pt1", 00:10:55.390 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:55.390 "is_configured": true, 00:10:55.390 "data_offset": 2048, 00:10:55.390 "data_size": 63488 00:10:55.390 }, 00:10:55.390 { 00:10:55.390 "name": "pt2", 00:10:55.390 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:55.390 "is_configured": true, 00:10:55.390 "data_offset": 2048, 00:10:55.390 "data_size": 63488 00:10:55.390 } 00:10:55.390 ] 00:10:55.390 } 00:10:55.390 } 00:10:55.390 }' 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:55.390 pt2' 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:55.390 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:55.649 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:55.649 "name": "pt1", 00:10:55.649 "aliases": [ 00:10:55.649 "00000000-0000-0000-0000-000000000001" 00:10:55.649 ], 00:10:55.649 "product_name": "passthru", 00:10:55.649 "block_size": 512, 00:10:55.649 "num_blocks": 65536, 00:10:55.649 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:55.649 "assigned_rate_limits": { 00:10:55.649 "rw_ios_per_sec": 0, 00:10:55.649 "rw_mbytes_per_sec": 0, 00:10:55.649 "r_mbytes_per_sec": 0, 00:10:55.649 "w_mbytes_per_sec": 0 00:10:55.649 }, 00:10:55.649 "claimed": true, 00:10:55.649 "claim_type": "exclusive_write", 00:10:55.649 "zoned": false, 00:10:55.649 "supported_io_types": { 00:10:55.649 "read": true, 00:10:55.649 "write": true, 00:10:55.649 "unmap": true, 00:10:55.649 "flush": true, 00:10:55.649 "reset": true, 00:10:55.649 "nvme_admin": false, 00:10:55.649 "nvme_io": false, 00:10:55.649 "nvme_io_md": false, 00:10:55.649 "write_zeroes": true, 00:10:55.649 "zcopy": true, 00:10:55.649 "get_zone_info": false, 00:10:55.649 "zone_management": false, 00:10:55.649 "zone_append": false, 00:10:55.649 "compare": false, 00:10:55.649 "compare_and_write": false, 00:10:55.649 "abort": true, 00:10:55.649 "seek_hole": false, 00:10:55.649 "seek_data": false, 00:10:55.649 "copy": true, 00:10:55.649 "nvme_iov_md": false 00:10:55.649 }, 00:10:55.649 "memory_domains": [ 00:10:55.649 { 00:10:55.649 "dma_device_id": "system", 00:10:55.649 "dma_device_type": 1 00:10:55.649 }, 00:10:55.649 { 00:10:55.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.649 "dma_device_type": 2 00:10:55.649 } 00:10:55.649 ], 00:10:55.649 
"driver_specific": { 00:10:55.649 "passthru": { 00:10:55.649 "name": "pt1", 00:10:55.649 "base_bdev_name": "malloc1" 00:10:55.649 } 00:10:55.649 } 00:10:55.649 }' 00:10:55.649 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.649 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.649 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:55.649 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.649 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.908 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:55.908 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.908 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.908 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:55.908 11:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.908 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.908 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:55.908 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:55.908 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:55.908 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:56.168 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:56.168 "name": "pt2", 00:10:56.168 "aliases": [ 00:10:56.168 "00000000-0000-0000-0000-000000000002" 00:10:56.168 ], 00:10:56.168 "product_name": 
"passthru", 00:10:56.168 "block_size": 512, 00:10:56.168 "num_blocks": 65536, 00:10:56.168 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:56.168 "assigned_rate_limits": { 00:10:56.168 "rw_ios_per_sec": 0, 00:10:56.168 "rw_mbytes_per_sec": 0, 00:10:56.168 "r_mbytes_per_sec": 0, 00:10:56.168 "w_mbytes_per_sec": 0 00:10:56.168 }, 00:10:56.168 "claimed": true, 00:10:56.168 "claim_type": "exclusive_write", 00:10:56.168 "zoned": false, 00:10:56.168 "supported_io_types": { 00:10:56.168 "read": true, 00:10:56.168 "write": true, 00:10:56.168 "unmap": true, 00:10:56.168 "flush": true, 00:10:56.168 "reset": true, 00:10:56.168 "nvme_admin": false, 00:10:56.168 "nvme_io": false, 00:10:56.168 "nvme_io_md": false, 00:10:56.168 "write_zeroes": true, 00:10:56.168 "zcopy": true, 00:10:56.168 "get_zone_info": false, 00:10:56.168 "zone_management": false, 00:10:56.168 "zone_append": false, 00:10:56.168 "compare": false, 00:10:56.168 "compare_and_write": false, 00:10:56.168 "abort": true, 00:10:56.168 "seek_hole": false, 00:10:56.168 "seek_data": false, 00:10:56.168 "copy": true, 00:10:56.168 "nvme_iov_md": false 00:10:56.168 }, 00:10:56.168 "memory_domains": [ 00:10:56.168 { 00:10:56.168 "dma_device_id": "system", 00:10:56.168 "dma_device_type": 1 00:10:56.168 }, 00:10:56.168 { 00:10:56.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:56.168 "dma_device_type": 2 00:10:56.168 } 00:10:56.168 ], 00:10:56.168 "driver_specific": { 00:10:56.168 "passthru": { 00:10:56.168 "name": "pt2", 00:10:56.168 "base_bdev_name": "malloc2" 00:10:56.168 } 00:10:56.168 } 00:10:56.168 }' 00:10:56.168 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:56.168 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:56.168 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:56.168 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:56.168 11:51:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:56.168 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:56.168 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:56.428 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:56.428 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:56.428 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:56.428 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:56.428 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:56.428 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:56.428 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:56.687 [2024-07-12 11:51:46.676696] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:56.687 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=9f30ec17-b986-49dc-a975-193a98a059a2 00:10:56.687 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9f30ec17-b986-49dc-a975-193a98a059a2 ']' 00:10:56.687 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:56.687 [2024-07-12 11:51:46.844957] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:56.687 [2024-07-12 11:51:46.844972] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:56.687 [2024-07-12 11:51:46.845009] bdev_raid.c: 474:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:10:56.687 [2024-07-12 11:51:46.845045] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:56.687 [2024-07-12 11:51:46.845051] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b4890 name raid_bdev1, state offline 00:10:56.687 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.687 11:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:56.946 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:56.946 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:56.946 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:56.946 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:57.205 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:57.205 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:57.205 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:57.205 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:57.465 [2024-07-12 11:51:47.675094] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:57.465 [2024-07-12 11:51:47.676045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:57.465 [2024-07-12 11:51:47.676086] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:57.465 [2024-07-12 11:51:47.676126] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:57.465 [2024-07-12 11:51:47.676137] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:57.465 [2024-07-12 11:51:47.676142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b3a50 name raid_bdev1, state configuring 00:10:57.465 request: 00:10:57.465 { 00:10:57.465 "name": "raid_bdev1", 00:10:57.465 "raid_level": "raid1", 00:10:57.465 "base_bdevs": [ 00:10:57.465 "malloc1", 00:10:57.465 "malloc2" 00:10:57.465 ], 00:10:57.465 "superblock": false, 00:10:57.465 "method": "bdev_raid_create", 00:10:57.465 "req_id": 1 00:10:57.465 } 00:10:57.465 Got JSON-RPC error response 00:10:57.465 response: 00:10:57.465 { 00:10:57.465 "code": -17, 00:10:57.465 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:57.465 } 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.465 11:51:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:57.724 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:57.724 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:57.724 11:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:57.983 [2024-07-12 11:51:48.019952] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:57.983 [2024-07-12 11:51:48.019983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:57.983 [2024-07-12 11:51:48.019995] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11094a0 00:10:57.983 [2024-07-12 11:51:48.020001] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:57.983 [2024-07-12 11:51:48.021158] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:57.983 [2024-07-12 11:51:48.021178] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:57.983 [2024-07-12 11:51:48.021225] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:57.983 [2024-07-12 11:51:48.021244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:57.983 pt1 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:57.983 "name": "raid_bdev1", 00:10:57.983 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:10:57.983 "strip_size_kb": 0, 00:10:57.983 "state": "configuring", 00:10:57.983 "raid_level": "raid1", 00:10:57.983 "superblock": true, 00:10:57.983 "num_base_bdevs": 2, 00:10:57.983 "num_base_bdevs_discovered": 1, 00:10:57.983 "num_base_bdevs_operational": 2, 00:10:57.983 "base_bdevs_list": [ 00:10:57.983 { 00:10:57.983 "name": "pt1", 00:10:57.983 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:57.983 "is_configured": true, 00:10:57.983 "data_offset": 2048, 00:10:57.983 "data_size": 63488 00:10:57.983 }, 00:10:57.983 { 00:10:57.983 "name": null, 00:10:57.983 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:57.983 "is_configured": false, 00:10:57.983 "data_offset": 2048, 00:10:57.983 "data_size": 63488 00:10:57.983 } 00:10:57.983 ] 00:10:57.983 }' 00:10:57.983 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:57.983 11:51:48 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.550 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:58.550 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:58.550 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:58.550 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:58.809 [2024-07-12 11:51:48.850112] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:58.809 [2024-07-12 11:51:48.850151] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:58.809 [2024-07-12 11:51:48.850162] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b3e10 00:10:58.809 [2024-07-12 11:51:48.850167] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:58.809 [2024-07-12 11:51:48.850414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:58.809 [2024-07-12 11:51:48.850424] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:58.809 [2024-07-12 11:51:48.850466] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:58.809 [2024-07-12 11:51:48.850478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:58.809 [2024-07-12 11:51:48.850565] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11083b0 00:10:58.809 [2024-07-12 11:51:48.850572] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:58.809 [2024-07-12 11:51:48.850681] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1109df0 00:10:58.809 [2024-07-12 
11:51:48.850766] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11083b0 00:10:58.809 [2024-07-12 11:51:48.850770] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11083b0 00:10:58.809 [2024-07-12 11:51:48.850834] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:58.809 pt2 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:58.809 11:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:10:58.809 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:58.809 "name": "raid_bdev1", 00:10:58.809 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:10:58.809 "strip_size_kb": 0, 00:10:58.809 "state": "online", 00:10:58.809 "raid_level": "raid1", 00:10:58.809 "superblock": true, 00:10:58.809 "num_base_bdevs": 2, 00:10:58.809 "num_base_bdevs_discovered": 2, 00:10:58.809 "num_base_bdevs_operational": 2, 00:10:58.809 "base_bdevs_list": [ 00:10:58.809 { 00:10:58.809 "name": "pt1", 00:10:58.809 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:58.809 "is_configured": true, 00:10:58.809 "data_offset": 2048, 00:10:58.809 "data_size": 63488 00:10:58.809 }, 00:10:58.809 { 00:10:58.809 "name": "pt2", 00:10:58.809 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:58.809 "is_configured": true, 00:10:58.809 "data_offset": 2048, 00:10:58.809 "data_size": 63488 00:10:58.809 } 00:10:58.809 ] 00:10:58.809 }' 00:10:58.809 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:58.809 11:51:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:59.377 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:59.377 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:59.377 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:59.377 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:59.377 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:59.377 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:59.377 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b raid_bdev1 00:10:59.377 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:59.635 [2024-07-12 11:51:49.688669] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:59.635 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:59.635 "name": "raid_bdev1", 00:10:59.635 "aliases": [ 00:10:59.635 "9f30ec17-b986-49dc-a975-193a98a059a2" 00:10:59.635 ], 00:10:59.635 "product_name": "Raid Volume", 00:10:59.635 "block_size": 512, 00:10:59.635 "num_blocks": 63488, 00:10:59.635 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:10:59.635 "assigned_rate_limits": { 00:10:59.635 "rw_ios_per_sec": 0, 00:10:59.635 "rw_mbytes_per_sec": 0, 00:10:59.635 "r_mbytes_per_sec": 0, 00:10:59.635 "w_mbytes_per_sec": 0 00:10:59.635 }, 00:10:59.635 "claimed": false, 00:10:59.635 "zoned": false, 00:10:59.635 "supported_io_types": { 00:10:59.635 "read": true, 00:10:59.635 "write": true, 00:10:59.635 "unmap": false, 00:10:59.635 "flush": false, 00:10:59.635 "reset": true, 00:10:59.635 "nvme_admin": false, 00:10:59.635 "nvme_io": false, 00:10:59.635 "nvme_io_md": false, 00:10:59.635 "write_zeroes": true, 00:10:59.635 "zcopy": false, 00:10:59.635 "get_zone_info": false, 00:10:59.635 "zone_management": false, 00:10:59.635 "zone_append": false, 00:10:59.635 "compare": false, 00:10:59.635 "compare_and_write": false, 00:10:59.635 "abort": false, 00:10:59.635 "seek_hole": false, 00:10:59.635 "seek_data": false, 00:10:59.635 "copy": false, 00:10:59.635 "nvme_iov_md": false 00:10:59.635 }, 00:10:59.635 "memory_domains": [ 00:10:59.635 { 00:10:59.635 "dma_device_id": "system", 00:10:59.635 "dma_device_type": 1 00:10:59.635 }, 00:10:59.635 { 00:10:59.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:59.635 "dma_device_type": 2 00:10:59.635 }, 00:10:59.635 { 00:10:59.635 "dma_device_id": "system", 00:10:59.635 "dma_device_type": 1 00:10:59.635 }, 00:10:59.635 { 00:10:59.635 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:59.635 "dma_device_type": 2 00:10:59.635 } 00:10:59.635 ], 00:10:59.635 "driver_specific": { 00:10:59.635 "raid": { 00:10:59.635 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:10:59.635 "strip_size_kb": 0, 00:10:59.635 "state": "online", 00:10:59.635 "raid_level": "raid1", 00:10:59.635 "superblock": true, 00:10:59.635 "num_base_bdevs": 2, 00:10:59.635 "num_base_bdevs_discovered": 2, 00:10:59.636 "num_base_bdevs_operational": 2, 00:10:59.636 "base_bdevs_list": [ 00:10:59.636 { 00:10:59.636 "name": "pt1", 00:10:59.636 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:59.636 "is_configured": true, 00:10:59.636 "data_offset": 2048, 00:10:59.636 "data_size": 63488 00:10:59.636 }, 00:10:59.636 { 00:10:59.636 "name": "pt2", 00:10:59.636 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:59.636 "is_configured": true, 00:10:59.636 "data_offset": 2048, 00:10:59.636 "data_size": 63488 00:10:59.636 } 00:10:59.636 ] 00:10:59.636 } 00:10:59.636 } 00:10:59.636 }' 00:10:59.636 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:59.636 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:59.636 pt2' 00:10:59.636 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:59.636 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:59.636 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:59.894 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:59.894 "name": "pt1", 00:10:59.894 "aliases": [ 00:10:59.894 "00000000-0000-0000-0000-000000000001" 00:10:59.894 ], 00:10:59.894 "product_name": "passthru", 00:10:59.894 
"block_size": 512, 00:10:59.894 "num_blocks": 65536, 00:10:59.894 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:59.894 "assigned_rate_limits": { 00:10:59.894 "rw_ios_per_sec": 0, 00:10:59.894 "rw_mbytes_per_sec": 0, 00:10:59.894 "r_mbytes_per_sec": 0, 00:10:59.894 "w_mbytes_per_sec": 0 00:10:59.894 }, 00:10:59.894 "claimed": true, 00:10:59.894 "claim_type": "exclusive_write", 00:10:59.894 "zoned": false, 00:10:59.894 "supported_io_types": { 00:10:59.894 "read": true, 00:10:59.894 "write": true, 00:10:59.894 "unmap": true, 00:10:59.894 "flush": true, 00:10:59.894 "reset": true, 00:10:59.894 "nvme_admin": false, 00:10:59.894 "nvme_io": false, 00:10:59.894 "nvme_io_md": false, 00:10:59.894 "write_zeroes": true, 00:10:59.894 "zcopy": true, 00:10:59.894 "get_zone_info": false, 00:10:59.894 "zone_management": false, 00:10:59.894 "zone_append": false, 00:10:59.894 "compare": false, 00:10:59.894 "compare_and_write": false, 00:10:59.894 "abort": true, 00:10:59.894 "seek_hole": false, 00:10:59.894 "seek_data": false, 00:10:59.894 "copy": true, 00:10:59.894 "nvme_iov_md": false 00:10:59.894 }, 00:10:59.894 "memory_domains": [ 00:10:59.894 { 00:10:59.894 "dma_device_id": "system", 00:10:59.894 "dma_device_type": 1 00:10:59.894 }, 00:10:59.894 { 00:10:59.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:59.894 "dma_device_type": 2 00:10:59.894 } 00:10:59.894 ], 00:10:59.894 "driver_specific": { 00:10:59.894 "passthru": { 00:10:59.894 "name": "pt1", 00:10:59.894 "base_bdev_name": "malloc1" 00:10:59.894 } 00:10:59.894 } 00:10:59.894 }' 00:10:59.894 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:59.894 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:59.894 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:59.894 11:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:59.894 11:51:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:59.894 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:59.894 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:59.894 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:00.152 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:00.152 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:00.152 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:00.152 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:00.152 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:00.152 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:00.152 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:00.411 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:00.411 "name": "pt2", 00:11:00.411 "aliases": [ 00:11:00.411 "00000000-0000-0000-0000-000000000002" 00:11:00.411 ], 00:11:00.411 "product_name": "passthru", 00:11:00.411 "block_size": 512, 00:11:00.411 "num_blocks": 65536, 00:11:00.411 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:00.411 "assigned_rate_limits": { 00:11:00.411 "rw_ios_per_sec": 0, 00:11:00.411 "rw_mbytes_per_sec": 0, 00:11:00.411 "r_mbytes_per_sec": 0, 00:11:00.411 "w_mbytes_per_sec": 0 00:11:00.411 }, 00:11:00.411 "claimed": true, 00:11:00.411 "claim_type": "exclusive_write", 00:11:00.411 "zoned": false, 00:11:00.411 "supported_io_types": { 00:11:00.411 "read": true, 00:11:00.411 "write": true, 00:11:00.411 "unmap": true, 00:11:00.411 
"flush": true, 00:11:00.411 "reset": true, 00:11:00.411 "nvme_admin": false, 00:11:00.411 "nvme_io": false, 00:11:00.411 "nvme_io_md": false, 00:11:00.411 "write_zeroes": true, 00:11:00.411 "zcopy": true, 00:11:00.411 "get_zone_info": false, 00:11:00.411 "zone_management": false, 00:11:00.411 "zone_append": false, 00:11:00.411 "compare": false, 00:11:00.411 "compare_and_write": false, 00:11:00.411 "abort": true, 00:11:00.411 "seek_hole": false, 00:11:00.411 "seek_data": false, 00:11:00.411 "copy": true, 00:11:00.411 "nvme_iov_md": false 00:11:00.411 }, 00:11:00.411 "memory_domains": [ 00:11:00.411 { 00:11:00.411 "dma_device_id": "system", 00:11:00.411 "dma_device_type": 1 00:11:00.411 }, 00:11:00.411 { 00:11:00.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.411 "dma_device_type": 2 00:11:00.411 } 00:11:00.411 ], 00:11:00.411 "driver_specific": { 00:11:00.411 "passthru": { 00:11:00.411 "name": "pt2", 00:11:00.411 "base_bdev_name": "malloc2" 00:11:00.411 } 00:11:00.411 } 00:11:00.411 }' 00:11:00.411 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.411 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.411 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:00.411 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:00.411 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:00.411 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:00.411 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:00.412 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:00.412 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:00.412 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:11:00.670 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:00.670 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:00.670 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:00.670 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:00.670 [2024-07-12 11:51:50.863719] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:00.670 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 9f30ec17-b986-49dc-a975-193a98a059a2 '!=' 9f30ec17-b986-49dc-a975-193a98a059a2 ']' 00:11:00.670 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:11:00.670 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:00.670 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:00.670 11:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:00.929 [2024-07-12 11:51:51.027995] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:00.929 11:51:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:00.929 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:01.188 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.188 "name": "raid_bdev1", 00:11:01.188 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:11:01.188 "strip_size_kb": 0, 00:11:01.188 "state": "online", 00:11:01.188 "raid_level": "raid1", 00:11:01.188 "superblock": true, 00:11:01.188 "num_base_bdevs": 2, 00:11:01.188 "num_base_bdevs_discovered": 1, 00:11:01.188 "num_base_bdevs_operational": 1, 00:11:01.188 "base_bdevs_list": [ 00:11:01.188 { 00:11:01.188 "name": null, 00:11:01.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.188 "is_configured": false, 00:11:01.188 "data_offset": 2048, 00:11:01.188 "data_size": 63488 00:11:01.188 }, 00:11:01.188 { 00:11:01.188 "name": "pt2", 00:11:01.188 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:01.188 "is_configured": true, 00:11:01.188 "data_offset": 2048, 00:11:01.188 "data_size": 63488 00:11:01.188 } 00:11:01.188 ] 00:11:01.188 }' 00:11:01.188 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.188 11:51:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
00:11:01.447 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:01.706 [2024-07-12 11:51:51.785924] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:01.706 [2024-07-12 11:51:51.785946] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:01.706 [2024-07-12 11:51:51.785987] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:01.706 [2024-07-12 11:51:51.786015] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:01.706 [2024-07-12 11:51:51.786021] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11083b0 name raid_bdev1, state offline 00:11:01.706 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:11:01.706 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.965 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:11:01.965 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:11:01.965 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:11:01.965 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:01.965 11:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:01.965 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:11:01.965 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:01.965 11:51:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:11:01.965 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:11:01.965 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:11:01.965 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:02.225 [2024-07-12 11:51:52.271277] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:02.225 [2024-07-12 11:51:52.271320] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:02.225 [2024-07-12 11:51:52.271332] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11096d0 00:11:02.225 [2024-07-12 11:51:52.271338] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:02.225 [2024-07-12 11:51:52.272488] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:02.225 [2024-07-12 11:51:52.272536] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:02.225 [2024-07-12 11:51:52.272589] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:02.225 [2024-07-12 11:51:52.272609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:02.225 [2024-07-12 11:51:52.272674] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12b3280 00:11:02.225 [2024-07-12 11:51:52.272679] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:02.225 [2024-07-12 11:51:52.272798] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1108e70 00:11:02.225 [2024-07-12 11:51:52.272886] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12b3280 00:11:02.225 
[2024-07-12 11:51:52.272891] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12b3280 00:11:02.225 [2024-07-12 11:51:52.272955] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:02.225 pt2 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:02.225 "name": "raid_bdev1", 00:11:02.225 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:11:02.225 "strip_size_kb": 0, 00:11:02.225 "state": "online", 00:11:02.225 "raid_level": "raid1", 
00:11:02.225 "superblock": true, 00:11:02.225 "num_base_bdevs": 2, 00:11:02.225 "num_base_bdevs_discovered": 1, 00:11:02.225 "num_base_bdevs_operational": 1, 00:11:02.225 "base_bdevs_list": [ 00:11:02.225 { 00:11:02.225 "name": null, 00:11:02.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:02.225 "is_configured": false, 00:11:02.225 "data_offset": 2048, 00:11:02.225 "data_size": 63488 00:11:02.225 }, 00:11:02.225 { 00:11:02.225 "name": "pt2", 00:11:02.225 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:02.225 "is_configured": true, 00:11:02.225 "data_offset": 2048, 00:11:02.225 "data_size": 63488 00:11:02.225 } 00:11:02.225 ] 00:11:02.225 }' 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:02.225 11:51:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.792 11:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:03.049 [2024-07-12 11:51:53.077347] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:03.049 [2024-07-12 11:51:53.077369] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:03.049 [2024-07-12 11:51:53.077414] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:03.049 [2024-07-12 11:51:53.077443] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:03.049 [2024-07-12 11:51:53.077449] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b3280 name raid_bdev1, state offline 00:11:03.049 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.049 11:51:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:11:03.049 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:11:03.049 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:11:03.049 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:11:03.049 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:03.307 [2024-07-12 11:51:53.418217] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:03.307 [2024-07-12 11:51:53.418252] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:03.307 [2024-07-12 11:51:53.418261] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b4040 00:11:03.307 [2024-07-12 11:51:53.418267] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:03.307 [2024-07-12 11:51:53.419417] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:03.307 [2024-07-12 11:51:53.419438] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:03.307 [2024-07-12 11:51:53.419484] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:03.307 [2024-07-12 11:51:53.419502] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:03.307 [2024-07-12 11:51:53.419579] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:11:03.307 [2024-07-12 11:51:53.419586] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:03.307 [2024-07-12 11:51:53.419593] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b9560 name raid_bdev1, state configuring 
00:11:03.307 [2024-07-12 11:51:53.419606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:03.307 [2024-07-12 11:51:53.419644] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12b7150 00:11:03.307 [2024-07-12 11:51:53.419649] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:03.307 [2024-07-12 11:51:53.419760] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b5fc0 00:11:03.307 [2024-07-12 11:51:53.419841] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12b7150 00:11:03.307 [2024-07-12 11:51:53.419845] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12b7150 00:11:03.307 [2024-07-12 11:51:53.419909] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:03.307 pt1 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.307 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:03.564 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.564 "name": "raid_bdev1", 00:11:03.565 "uuid": "9f30ec17-b986-49dc-a975-193a98a059a2", 00:11:03.565 "strip_size_kb": 0, 00:11:03.565 "state": "online", 00:11:03.565 "raid_level": "raid1", 00:11:03.565 "superblock": true, 00:11:03.565 "num_base_bdevs": 2, 00:11:03.565 "num_base_bdevs_discovered": 1, 00:11:03.565 "num_base_bdevs_operational": 1, 00:11:03.565 "base_bdevs_list": [ 00:11:03.565 { 00:11:03.565 "name": null, 00:11:03.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.565 "is_configured": false, 00:11:03.565 "data_offset": 2048, 00:11:03.565 "data_size": 63488 00:11:03.565 }, 00:11:03.565 { 00:11:03.565 "name": "pt2", 00:11:03.565 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:03.565 "is_configured": true, 00:11:03.565 "data_offset": 2048, 00:11:03.565 "data_size": 63488 00:11:03.565 } 00:11:03.565 ] 00:11:03.565 }' 00:11:03.565 11:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.565 11:51:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.130 11:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:11:04.131 11:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:04.131 11:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == 
\f\a\l\s\e ]] 00:11:04.131 11:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:04.131 11:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:11:04.389 [2024-07-12 11:51:54.424961] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:04.389 11:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 9f30ec17-b986-49dc-a975-193a98a059a2 '!=' 9f30ec17-b986-49dc-a975-193a98a059a2 ']' 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 606557 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 606557 ']' 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 606557 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 606557 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 606557' 00:11:04.390 killing process with pid 606557 00:11:04.390 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 606557 00:11:04.390 [2024-07-12 11:51:54.480217] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:04.390 [2024-07-12 11:51:54.480262] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:04.390 
11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 606557 00:11:04.390 [2024-07-12 11:51:54.480292] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:04.390 [2024-07-12 11:51:54.480299] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b7150 name raid_bdev1, state offline 00:11:04.390 [2024-07-12 11:51:54.495349] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:04.648 11:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:04.649 00:11:04.649 real 0m11.680s 00:11:04.649 user 0m21.451s 00:11:04.649 sys 0m1.848s 00:11:04.649 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:04.649 11:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.649 ************************************ 00:11:04.649 END TEST raid_superblock_test 00:11:04.649 ************************************ 00:11:04.649 11:51:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:04.649 11:51:54 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:11:04.649 11:51:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:04.649 11:51:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:04.649 11:51:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:04.649 ************************************ 00:11:04.649 START TEST raid_read_error_test 00:11:04.649 ************************************ 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:04.649 11:51:54 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.pGriIbxffa 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=608726 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 608726 /var/tmp/spdk-raid.sock 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 608726 ']' 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:04.649 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:04.649 11:51:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.649 [2024-07-12 11:51:54.788363] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:11:04.649 [2024-07-12 11:51:54.788399] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid608726 ] 00:11:04.649 [2024-07-12 11:51:54.853641] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:04.907 [2024-07-12 11:51:54.932013] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:04.907 [2024-07-12 11:51:54.992591] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:04.907 [2024-07-12 11:51:54.992616] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:05.475 11:51:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:05.475 11:51:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:05.475 11:51:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:05.475 11:51:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:05.732 BaseBdev1_malloc 00:11:05.732 11:51:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:05.732 true 00:11:05.732 11:51:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:05.989 [2024-07-12 11:51:56.069839] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:05.989 [2024-07-12 11:51:56.069868] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:05.989 
[2024-07-12 11:51:56.069879] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25f82d0 00:11:05.989 [2024-07-12 11:51:56.069885] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:05.989 [2024-07-12 11:51:56.071123] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:05.989 [2024-07-12 11:51:56.071144] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:05.989 BaseBdev1 00:11:05.989 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:05.989 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:05.989 BaseBdev2_malloc 00:11:06.246 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:06.246 true 00:11:06.246 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:06.504 [2024-07-12 11:51:56.566651] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:06.504 [2024-07-12 11:51:56.566681] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:06.504 [2024-07-12 11:51:56.566693] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25fcf40 00:11:06.504 [2024-07-12 11:51:56.566699] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:06.504 [2024-07-12 11:51:56.567776] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:06.504 [2024-07-12 11:51:56.567796] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:06.504 BaseBdev2 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:06.504 [2024-07-12 11:51:56.731103] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:06.504 [2024-07-12 11:51:56.731981] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:06.504 [2024-07-12 11:51:56.732105] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25fdc80 00:11:06.504 [2024-07-12 11:51:56.732113] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:06.504 [2024-07-12 11:51:56.732240] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25ff7b0 00:11:06.504 [2024-07-12 11:51:56.732339] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25fdc80 00:11:06.504 [2024-07-12 11:51:56.732344] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25fdc80 00:11:06.504 [2024-07-12 11:51:56.732411] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:06.504 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:06.762 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:06.762 "name": "raid_bdev1", 00:11:06.762 "uuid": "91726f5d-f725-4ffc-b1e0-1a3ba07ccd66", 00:11:06.762 "strip_size_kb": 0, 00:11:06.762 "state": "online", 00:11:06.762 "raid_level": "raid1", 00:11:06.762 "superblock": true, 00:11:06.762 "num_base_bdevs": 2, 00:11:06.762 "num_base_bdevs_discovered": 2, 00:11:06.762 "num_base_bdevs_operational": 2, 00:11:06.762 "base_bdevs_list": [ 00:11:06.762 { 00:11:06.762 "name": "BaseBdev1", 00:11:06.762 "uuid": "89404c01-a439-5702-b645-5913d70feb84", 00:11:06.762 "is_configured": true, 00:11:06.762 "data_offset": 2048, 00:11:06.762 "data_size": 63488 00:11:06.762 }, 00:11:06.762 { 00:11:06.762 "name": "BaseBdev2", 00:11:06.762 "uuid": "669ec224-c29b-5d5f-b7c4-a6689eaeb5f4", 00:11:06.762 "is_configured": true, 00:11:06.762 "data_offset": 2048, 00:11:06.762 "data_size": 63488 00:11:06.762 } 00:11:06.762 ] 00:11:06.763 }' 00:11:06.763 11:51:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:06.763 11:51:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:07.328 11:51:57 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:11:07.328 11:51:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:07.328 [2024-07-12 11:51:57.461169] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25ff440 00:11:08.262 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:08.520 "name": "raid_bdev1", 00:11:08.520 "uuid": "91726f5d-f725-4ffc-b1e0-1a3ba07ccd66", 00:11:08.520 "strip_size_kb": 0, 00:11:08.520 "state": "online", 00:11:08.520 "raid_level": "raid1", 00:11:08.520 "superblock": true, 00:11:08.520 "num_base_bdevs": 2, 00:11:08.520 "num_base_bdevs_discovered": 2, 00:11:08.520 "num_base_bdevs_operational": 2, 00:11:08.520 "base_bdevs_list": [ 00:11:08.520 { 00:11:08.520 "name": "BaseBdev1", 00:11:08.520 "uuid": "89404c01-a439-5702-b645-5913d70feb84", 00:11:08.520 "is_configured": true, 00:11:08.520 "data_offset": 2048, 00:11:08.520 "data_size": 63488 00:11:08.520 }, 00:11:08.520 { 00:11:08.520 "name": "BaseBdev2", 00:11:08.520 "uuid": "669ec224-c29b-5d5f-b7c4-a6689eaeb5f4", 00:11:08.520 "is_configured": true, 00:11:08.520 "data_offset": 2048, 00:11:08.520 "data_size": 63488 00:11:08.520 } 00:11:08.520 ] 00:11:08.520 }' 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:08.520 11:51:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:09.087 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:09.346 [2024-07-12 11:51:59.392559] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:09.346 [2024-07-12 
11:51:59.392588] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:09.346 [2024-07-12 11:51:59.394636] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:09.346 [2024-07-12 11:51:59.394656] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:09.346 [2024-07-12 11:51:59.394706] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:09.346 [2024-07-12 11:51:59.394712] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25fdc80 name raid_bdev1, state offline 00:11:09.346 0 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 608726 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 608726 ']' 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 608726 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 608726 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 608726' 00:11:09.346 killing process with pid 608726 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 608726 00:11:09.346 [2024-07-12 11:51:59.439636] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:09.346 11:51:59 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@972 -- # wait 608726 00:11:09.346 [2024-07-12 11:51:59.449521] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:09.605 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:09.605 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.pGriIbxffa 00:11:09.605 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:09.605 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:11:09.605 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:11:09.605 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:09.605 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:09.605 11:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:11:09.605 00:11:09.605 real 0m4.910s 00:11:09.605 user 0m7.499s 00:11:09.605 sys 0m0.715s 00:11:09.605 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:09.606 11:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:09.606 ************************************ 00:11:09.606 END TEST raid_read_error_test 00:11:09.606 ************************************ 00:11:09.606 11:51:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:09.606 11:51:59 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:11:09.606 11:51:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:09.606 11:51:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:09.606 11:51:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:09.606 ************************************ 00:11:09.606 START TEST raid_write_error_test 00:11:09.606 ************************************ 
00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local 
fail_per_s 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.LRt98ajFnp 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=609713 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 609713 /var/tmp/spdk-raid.sock 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 609713 ']' 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:09.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:09.606 11:51:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:09.606 [2024-07-12 11:51:59.770990] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:11:09.606 [2024-07-12 11:51:59.771028] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid609713 ] 00:11:09.606 [2024-07-12 11:51:59.833128] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:09.864 [2024-07-12 11:51:59.904128] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:09.864 [2024-07-12 11:51:59.959343] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:09.864 [2024-07-12 11:51:59.959369] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:10.429 11:52:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:10.429 11:52:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:10.429 11:52:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:10.429 11:52:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:10.687 BaseBdev1_malloc 00:11:10.687 11:52:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:10.687 true 00:11:10.687 11:52:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:10.945 [2024-07-12 11:52:01.023532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:10.945 [2024-07-12 11:52:01.023565] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:10.945 
[2024-07-12 11:52:01.023575] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11f82d0 00:11:10.945 [2024-07-12 11:52:01.023581] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:10.945 [2024-07-12 11:52:01.024671] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:10.945 [2024-07-12 11:52:01.024692] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:10.945 BaseBdev1 00:11:10.945 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:10.945 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:11.203 BaseBdev2_malloc 00:11:11.203 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:11.203 true 00:11:11.203 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:11.461 [2024-07-12 11:52:01.524172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:11.461 [2024-07-12 11:52:01.524199] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:11.461 [2024-07-12 11:52:01.524208] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11fcf40 00:11:11.462 [2024-07-12 11:52:01.524214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:11.462 [2024-07-12 11:52:01.525152] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:11.462 [2024-07-12 11:52:01.525171] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:11.462 BaseBdev2 00:11:11.462 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:11.462 [2024-07-12 11:52:01.700655] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:11.462 [2024-07-12 11:52:01.701473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:11.462 [2024-07-12 11:52:01.701605] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11fdc80 00:11:11.462 [2024-07-12 11:52:01.701614] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:11.462 [2024-07-12 11:52:01.701731] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11ff7b0 00:11:11.462 [2024-07-12 11:52:01.701833] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11fdc80 00:11:11.462 [2024-07-12 11:52:01.701838] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11fdc80 00:11:11.462 [2024-07-12 11:52:01.701904] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=2 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:11.720 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:11.720 "name": "raid_bdev1", 00:11:11.720 "uuid": "3848e01f-447c-48c4-a1ee-3d7e8b03eeba", 00:11:11.720 "strip_size_kb": 0, 00:11:11.720 "state": "online", 00:11:11.720 "raid_level": "raid1", 00:11:11.720 "superblock": true, 00:11:11.720 "num_base_bdevs": 2, 00:11:11.720 "num_base_bdevs_discovered": 2, 00:11:11.720 "num_base_bdevs_operational": 2, 00:11:11.720 "base_bdevs_list": [ 00:11:11.720 { 00:11:11.720 "name": "BaseBdev1", 00:11:11.720 "uuid": "e9bd5e0d-df99-540d-98e1-09cf9805538b", 00:11:11.720 "is_configured": true, 00:11:11.720 "data_offset": 2048, 00:11:11.720 "data_size": 63488 00:11:11.720 }, 00:11:11.720 { 00:11:11.720 "name": "BaseBdev2", 00:11:11.721 "uuid": "83fd440a-dcd7-58c9-8d2c-72edbdcff8a7", 00:11:11.721 "is_configured": true, 00:11:11.721 "data_offset": 2048, 00:11:11.721 "data_size": 63488 00:11:11.721 } 00:11:11.721 ] 00:11:11.721 }' 00:11:11.721 11:52:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:11.721 11:52:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.289 11:52:02 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:12.289 11:52:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:12.289 [2024-07-12 11:52:02.454787] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11ff440 00:11:13.227 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:13.488 [2024-07-12 11:52:03.531072] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:11:13.488 [2024-07-12 11:52:03.531119] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:13.488 [2024-07-12 11:52:03.531270] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x11ff440 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.488 "name": "raid_bdev1", 00:11:13.488 "uuid": "3848e01f-447c-48c4-a1ee-3d7e8b03eeba", 00:11:13.488 "strip_size_kb": 0, 00:11:13.488 "state": "online", 00:11:13.488 "raid_level": "raid1", 00:11:13.488 "superblock": true, 00:11:13.488 "num_base_bdevs": 2, 00:11:13.488 "num_base_bdevs_discovered": 1, 00:11:13.488 "num_base_bdevs_operational": 1, 00:11:13.488 "base_bdevs_list": [ 00:11:13.488 { 00:11:13.488 "name": null, 00:11:13.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:13.488 "is_configured": false, 00:11:13.488 "data_offset": 2048, 00:11:13.488 "data_size": 63488 00:11:13.488 }, 00:11:13.488 { 00:11:13.488 "name": "BaseBdev2", 00:11:13.488 "uuid": "83fd440a-dcd7-58c9-8d2c-72edbdcff8a7", 00:11:13.488 "is_configured": true, 00:11:13.488 "data_offset": 2048, 00:11:13.488 "data_size": 63488 00:11:13.488 } 00:11:13.488 ] 00:11:13.488 }' 00:11:13.488 11:52:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.488 
11:52:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.057 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:14.316 [2024-07-12 11:52:04.351601] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:14.316 [2024-07-12 11:52:04.351629] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:14.316 [2024-07-12 11:52:04.353649] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:14.316 [2024-07-12 11:52:04.353669] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:14.316 [2024-07-12 11:52:04.353707] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:14.316 [2024-07-12 11:52:04.353713] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11fdc80 name raid_bdev1, state offline 00:11:14.316 0 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 609713 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 609713 ']' 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 609713 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 609713 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:14.316 11:52:04 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 609713' 00:11:14.316 killing process with pid 609713 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 609713 00:11:14.316 [2024-07-12 11:52:04.410484] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:14.316 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 609713 00:11:14.316 [2024-07-12 11:52:04.419278] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.LRt98ajFnp 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:11:14.575 00:11:14.575 real 0m4.903s 00:11:14.575 user 0m7.481s 00:11:14.575 sys 0m0.715s 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:14.575 11:52:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.575 ************************************ 00:11:14.575 END TEST raid_write_error_test 00:11:14.575 ************************************ 00:11:14.575 11:52:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:14.575 11:52:04 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 
00:11:14.575 11:52:04 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:14.575 11:52:04 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:11:14.575 11:52:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:14.575 11:52:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:14.575 11:52:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:14.575 ************************************ 00:11:14.575 START TEST raid_state_function_test 00:11:14.575 ************************************ 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:14.575 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 
-- # (( i <= num_base_bdevs )) 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=610707 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process 
raid pid: 610707' 00:11:14.576 Process raid pid: 610707 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 610707 /var/tmp/spdk-raid.sock 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 610707 ']' 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:14.576 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:14.576 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.576 [2024-07-12 11:52:04.728546] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:11:14.576 [2024-07-12 11:52:04.728581] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:14.576 [2024-07-12 11:52:04.792604] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:14.835 [2024-07-12 11:52:04.871973] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:14.835 [2024-07-12 11:52:04.922520] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:14.835 [2024-07-12 11:52:04.922543] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:15.401 11:52:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:15.401 11:52:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:15.401 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:15.660 [2024-07-12 11:52:05.657005] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:15.660 [2024-07-12 11:52:05.657033] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:15.660 [2024-07-12 11:52:05.657039] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:15.660 [2024-07-12 11:52:05.657044] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:15.660 [2024-07-12 11:52:05.657048] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:15.660 [2024-07-12 11:52:05.657069] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:15.660 11:52:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:15.660 "name": "Existed_Raid", 00:11:15.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.660 "strip_size_kb": 64, 00:11:15.660 "state": "configuring", 00:11:15.660 "raid_level": "raid0", 00:11:15.660 "superblock": false, 00:11:15.660 "num_base_bdevs": 3, 00:11:15.660 "num_base_bdevs_discovered": 0, 00:11:15.660 "num_base_bdevs_operational": 3, 00:11:15.660 "base_bdevs_list": [ 00:11:15.660 { 
00:11:15.660 "name": "BaseBdev1", 00:11:15.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.660 "is_configured": false, 00:11:15.660 "data_offset": 0, 00:11:15.660 "data_size": 0 00:11:15.660 }, 00:11:15.660 { 00:11:15.660 "name": "BaseBdev2", 00:11:15.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.660 "is_configured": false, 00:11:15.660 "data_offset": 0, 00:11:15.660 "data_size": 0 00:11:15.660 }, 00:11:15.660 { 00:11:15.660 "name": "BaseBdev3", 00:11:15.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.660 "is_configured": false, 00:11:15.660 "data_offset": 0, 00:11:15.660 "data_size": 0 00:11:15.660 } 00:11:15.660 ] 00:11:15.660 }' 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:15.660 11:52:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:16.229 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:16.229 [2024-07-12 11:52:06.450971] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:16.229 [2024-07-12 11:52:06.450992] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19db1d0 name Existed_Raid, state configuring 00:11:16.229 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:16.487 [2024-07-12 11:52:06.615408] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:16.487 [2024-07-12 11:52:06.615434] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:16.487 [2024-07-12 11:52:06.615439] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:11:16.487 [2024-07-12 11:52:06.615444] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:16.488 [2024-07-12 11:52:06.615448] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:16.488 [2024-07-12 11:52:06.615453] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:16.488 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:16.746 [2024-07-12 11:52:06.783937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:16.746 BaseBdev1 00:11:16.746 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:16.746 11:52:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:16.746 11:52:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:16.746 11:52:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:16.746 11:52:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:16.746 11:52:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:16.747 11:52:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:16.747 11:52:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:17.006 [ 00:11:17.006 { 00:11:17.006 "name": "BaseBdev1", 00:11:17.006 "aliases": [ 00:11:17.006 
"83bd5edf-d670-4503-9039-51c429b95f2d" 00:11:17.006 ], 00:11:17.006 "product_name": "Malloc disk", 00:11:17.006 "block_size": 512, 00:11:17.006 "num_blocks": 65536, 00:11:17.006 "uuid": "83bd5edf-d670-4503-9039-51c429b95f2d", 00:11:17.006 "assigned_rate_limits": { 00:11:17.006 "rw_ios_per_sec": 0, 00:11:17.006 "rw_mbytes_per_sec": 0, 00:11:17.006 "r_mbytes_per_sec": 0, 00:11:17.006 "w_mbytes_per_sec": 0 00:11:17.006 }, 00:11:17.006 "claimed": true, 00:11:17.006 "claim_type": "exclusive_write", 00:11:17.006 "zoned": false, 00:11:17.006 "supported_io_types": { 00:11:17.006 "read": true, 00:11:17.006 "write": true, 00:11:17.006 "unmap": true, 00:11:17.006 "flush": true, 00:11:17.006 "reset": true, 00:11:17.006 "nvme_admin": false, 00:11:17.006 "nvme_io": false, 00:11:17.006 "nvme_io_md": false, 00:11:17.006 "write_zeroes": true, 00:11:17.006 "zcopy": true, 00:11:17.006 "get_zone_info": false, 00:11:17.006 "zone_management": false, 00:11:17.006 "zone_append": false, 00:11:17.006 "compare": false, 00:11:17.006 "compare_and_write": false, 00:11:17.006 "abort": true, 00:11:17.006 "seek_hole": false, 00:11:17.006 "seek_data": false, 00:11:17.006 "copy": true, 00:11:17.006 "nvme_iov_md": false 00:11:17.006 }, 00:11:17.006 "memory_domains": [ 00:11:17.006 { 00:11:17.006 "dma_device_id": "system", 00:11:17.006 "dma_device_type": 1 00:11:17.006 }, 00:11:17.006 { 00:11:17.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.006 "dma_device_type": 2 00:11:17.006 } 00:11:17.006 ], 00:11:17.006 "driver_specific": {} 00:11:17.006 } 00:11:17.006 ] 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.006 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:17.265 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:17.265 "name": "Existed_Raid", 00:11:17.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.265 "strip_size_kb": 64, 00:11:17.265 "state": "configuring", 00:11:17.265 "raid_level": "raid0", 00:11:17.265 "superblock": false, 00:11:17.265 "num_base_bdevs": 3, 00:11:17.265 "num_base_bdevs_discovered": 1, 00:11:17.265 "num_base_bdevs_operational": 3, 00:11:17.265 "base_bdevs_list": [ 00:11:17.265 { 00:11:17.265 "name": "BaseBdev1", 00:11:17.265 "uuid": "83bd5edf-d670-4503-9039-51c429b95f2d", 00:11:17.265 "is_configured": true, 00:11:17.265 "data_offset": 0, 00:11:17.265 "data_size": 65536 00:11:17.265 }, 00:11:17.265 { 00:11:17.265 "name": "BaseBdev2", 00:11:17.266 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:11:17.266 "is_configured": false, 00:11:17.266 "data_offset": 0, 00:11:17.266 "data_size": 0 00:11:17.266 }, 00:11:17.266 { 00:11:17.266 "name": "BaseBdev3", 00:11:17.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.266 "is_configured": false, 00:11:17.266 "data_offset": 0, 00:11:17.266 "data_size": 0 00:11:17.266 } 00:11:17.266 ] 00:11:17.266 }' 00:11:17.266 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.266 11:52:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.833 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:17.833 [2024-07-12 11:52:07.926894] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:17.833 [2024-07-12 11:52:07.926924] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19daaa0 name Existed_Raid, state configuring 00:11:17.833 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:18.092 [2024-07-12 11:52:08.099358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:18.092 [2024-07-12 11:52:08.100386] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:18.092 [2024-07-12 11:52:08.100410] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:18.092 [2024-07-12 11:52:08.100415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:18.092 [2024-07-12 11:52:08.100420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.092 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:18.093 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.093 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.093 "name": "Existed_Raid", 00:11:18.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.093 "strip_size_kb": 64, 00:11:18.093 "state": "configuring", 00:11:18.093 
"raid_level": "raid0", 00:11:18.093 "superblock": false, 00:11:18.093 "num_base_bdevs": 3, 00:11:18.093 "num_base_bdevs_discovered": 1, 00:11:18.093 "num_base_bdevs_operational": 3, 00:11:18.093 "base_bdevs_list": [ 00:11:18.093 { 00:11:18.093 "name": "BaseBdev1", 00:11:18.093 "uuid": "83bd5edf-d670-4503-9039-51c429b95f2d", 00:11:18.093 "is_configured": true, 00:11:18.093 "data_offset": 0, 00:11:18.093 "data_size": 65536 00:11:18.093 }, 00:11:18.093 { 00:11:18.093 "name": "BaseBdev2", 00:11:18.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.093 "is_configured": false, 00:11:18.093 "data_offset": 0, 00:11:18.093 "data_size": 0 00:11:18.093 }, 00:11:18.093 { 00:11:18.093 "name": "BaseBdev3", 00:11:18.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.093 "is_configured": false, 00:11:18.093 "data_offset": 0, 00:11:18.093 "data_size": 0 00:11:18.093 } 00:11:18.093 ] 00:11:18.093 }' 00:11:18.093 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.093 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:18.661 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:18.920 [2024-07-12 11:52:08.924189] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:18.920 BaseBdev2 00:11:18.920 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:18.920 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:18.920 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:18.920 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:18.920 11:52:08 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:18.920 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:18.920 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:18.920 11:52:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:19.184 [ 00:11:19.184 { 00:11:19.184 "name": "BaseBdev2", 00:11:19.184 "aliases": [ 00:11:19.184 "d68f4af0-aaf5-4f8a-9bb2-56cc450b6679" 00:11:19.184 ], 00:11:19.184 "product_name": "Malloc disk", 00:11:19.184 "block_size": 512, 00:11:19.184 "num_blocks": 65536, 00:11:19.184 "uuid": "d68f4af0-aaf5-4f8a-9bb2-56cc450b6679", 00:11:19.184 "assigned_rate_limits": { 00:11:19.184 "rw_ios_per_sec": 0, 00:11:19.184 "rw_mbytes_per_sec": 0, 00:11:19.184 "r_mbytes_per_sec": 0, 00:11:19.184 "w_mbytes_per_sec": 0 00:11:19.184 }, 00:11:19.184 "claimed": true, 00:11:19.184 "claim_type": "exclusive_write", 00:11:19.184 "zoned": false, 00:11:19.184 "supported_io_types": { 00:11:19.184 "read": true, 00:11:19.184 "write": true, 00:11:19.184 "unmap": true, 00:11:19.184 "flush": true, 00:11:19.184 "reset": true, 00:11:19.184 "nvme_admin": false, 00:11:19.184 "nvme_io": false, 00:11:19.184 "nvme_io_md": false, 00:11:19.184 "write_zeroes": true, 00:11:19.184 "zcopy": true, 00:11:19.184 "get_zone_info": false, 00:11:19.184 "zone_management": false, 00:11:19.184 "zone_append": false, 00:11:19.184 "compare": false, 00:11:19.184 "compare_and_write": false, 00:11:19.184 "abort": true, 00:11:19.184 "seek_hole": false, 00:11:19.184 "seek_data": false, 00:11:19.184 "copy": true, 00:11:19.184 "nvme_iov_md": false 00:11:19.184 }, 00:11:19.184 "memory_domains": [ 00:11:19.184 { 00:11:19.184 "dma_device_id": "system", 
00:11:19.184 "dma_device_type": 1 00:11:19.184 }, 00:11:19.184 { 00:11:19.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:19.184 "dma_device_type": 2 00:11:19.184 } 00:11:19.184 ], 00:11:19.184 "driver_specific": {} 00:11:19.184 } 00:11:19.184 ] 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:19.184 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:19.184 "name": "Existed_Raid", 00:11:19.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:19.184 "strip_size_kb": 64, 00:11:19.184 "state": "configuring", 00:11:19.184 "raid_level": "raid0", 00:11:19.184 "superblock": false, 00:11:19.184 "num_base_bdevs": 3, 00:11:19.184 "num_base_bdevs_discovered": 2, 00:11:19.185 "num_base_bdevs_operational": 3, 00:11:19.185 "base_bdevs_list": [ 00:11:19.185 { 00:11:19.185 "name": "BaseBdev1", 00:11:19.185 "uuid": "83bd5edf-d670-4503-9039-51c429b95f2d", 00:11:19.185 "is_configured": true, 00:11:19.185 "data_offset": 0, 00:11:19.185 "data_size": 65536 00:11:19.185 }, 00:11:19.185 { 00:11:19.185 "name": "BaseBdev2", 00:11:19.185 "uuid": "d68f4af0-aaf5-4f8a-9bb2-56cc450b6679", 00:11:19.185 "is_configured": true, 00:11:19.185 "data_offset": 0, 00:11:19.185 "data_size": 65536 00:11:19.185 }, 00:11:19.185 { 00:11:19.185 "name": "BaseBdev3", 00:11:19.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:19.185 "is_configured": false, 00:11:19.185 "data_offset": 0, 00:11:19.185 "data_size": 0 00:11:19.185 } 00:11:19.185 ] 00:11:19.185 }' 00:11:19.185 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:19.444 11:52:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.702 11:52:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:19.960 [2024-07-12 11:52:10.046212] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:19.960 [2024-07-12 11:52:10.046252] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19db990 00:11:19.960 [2024-07-12 11:52:10.046256] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:19.960 [2024-07-12 11:52:10.046398] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e1210 00:11:19.960 [2024-07-12 11:52:10.046484] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19db990 00:11:19.960 [2024-07-12 11:52:10.046490] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19db990 00:11:19.960 [2024-07-12 11:52:10.046642] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:19.960 BaseBdev3 00:11:19.960 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:19.960 11:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:19.960 11:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:19.960 11:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:19.960 11:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:19.960 11:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:19.960 11:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:20.219 11:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:20.219 [ 00:11:20.219 { 00:11:20.219 "name": "BaseBdev3", 00:11:20.219 "aliases": [ 00:11:20.219 "b2568a49-18a5-4a38-9b27-dafd74decab8" 00:11:20.219 ], 00:11:20.219 "product_name": "Malloc disk", 00:11:20.219 "block_size": 512, 00:11:20.219 "num_blocks": 65536, 00:11:20.219 
"uuid": "b2568a49-18a5-4a38-9b27-dafd74decab8", 00:11:20.219 "assigned_rate_limits": { 00:11:20.219 "rw_ios_per_sec": 0, 00:11:20.219 "rw_mbytes_per_sec": 0, 00:11:20.219 "r_mbytes_per_sec": 0, 00:11:20.219 "w_mbytes_per_sec": 0 00:11:20.219 }, 00:11:20.219 "claimed": true, 00:11:20.219 "claim_type": "exclusive_write", 00:11:20.219 "zoned": false, 00:11:20.219 "supported_io_types": { 00:11:20.219 "read": true, 00:11:20.219 "write": true, 00:11:20.219 "unmap": true, 00:11:20.219 "flush": true, 00:11:20.219 "reset": true, 00:11:20.219 "nvme_admin": false, 00:11:20.219 "nvme_io": false, 00:11:20.219 "nvme_io_md": false, 00:11:20.219 "write_zeroes": true, 00:11:20.219 "zcopy": true, 00:11:20.219 "get_zone_info": false, 00:11:20.219 "zone_management": false, 00:11:20.219 "zone_append": false, 00:11:20.219 "compare": false, 00:11:20.219 "compare_and_write": false, 00:11:20.219 "abort": true, 00:11:20.219 "seek_hole": false, 00:11:20.219 "seek_data": false, 00:11:20.219 "copy": true, 00:11:20.219 "nvme_iov_md": false 00:11:20.219 }, 00:11:20.219 "memory_domains": [ 00:11:20.219 { 00:11:20.219 "dma_device_id": "system", 00:11:20.219 "dma_device_type": 1 00:11:20.219 }, 00:11:20.219 { 00:11:20.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.219 "dma_device_type": 2 00:11:20.219 } 00:11:20.219 ], 00:11:20.219 "driver_specific": {} 00:11:20.219 } 00:11:20.219 ] 00:11:20.219 11:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:20.219 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:20.219 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:20.219 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:20.219 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:20.220 11:52:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:20.220 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:20.220 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:20.220 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:20.220 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:20.220 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:20.220 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:20.220 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:20.220 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.220 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:20.479 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:20.479 "name": "Existed_Raid", 00:11:20.479 "uuid": "5aea4661-abda-486f-a8ff-a5070ab41fc8", 00:11:20.479 "strip_size_kb": 64, 00:11:20.479 "state": "online", 00:11:20.479 "raid_level": "raid0", 00:11:20.479 "superblock": false, 00:11:20.479 "num_base_bdevs": 3, 00:11:20.479 "num_base_bdevs_discovered": 3, 00:11:20.479 "num_base_bdevs_operational": 3, 00:11:20.479 "base_bdevs_list": [ 00:11:20.479 { 00:11:20.479 "name": "BaseBdev1", 00:11:20.479 "uuid": "83bd5edf-d670-4503-9039-51c429b95f2d", 00:11:20.479 "is_configured": true, 00:11:20.479 "data_offset": 0, 00:11:20.479 "data_size": 65536 00:11:20.479 }, 00:11:20.479 { 00:11:20.479 "name": "BaseBdev2", 00:11:20.479 "uuid": 
"d68f4af0-aaf5-4f8a-9bb2-56cc450b6679", 00:11:20.479 "is_configured": true, 00:11:20.479 "data_offset": 0, 00:11:20.479 "data_size": 65536 00:11:20.479 }, 00:11:20.479 { 00:11:20.479 "name": "BaseBdev3", 00:11:20.479 "uuid": "b2568a49-18a5-4a38-9b27-dafd74decab8", 00:11:20.479 "is_configured": true, 00:11:20.479 "data_offset": 0, 00:11:20.479 "data_size": 65536 00:11:20.479 } 00:11:20.479 ] 00:11:20.479 }' 00:11:20.479 11:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:20.479 11:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:21.047 [2024-07-12 11:52:11.221431] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:21.047 "name": "Existed_Raid", 00:11:21.047 "aliases": [ 00:11:21.047 "5aea4661-abda-486f-a8ff-a5070ab41fc8" 00:11:21.047 ], 00:11:21.047 "product_name": "Raid Volume", 
00:11:21.047 "block_size": 512, 00:11:21.047 "num_blocks": 196608, 00:11:21.047 "uuid": "5aea4661-abda-486f-a8ff-a5070ab41fc8", 00:11:21.047 "assigned_rate_limits": { 00:11:21.047 "rw_ios_per_sec": 0, 00:11:21.047 "rw_mbytes_per_sec": 0, 00:11:21.047 "r_mbytes_per_sec": 0, 00:11:21.047 "w_mbytes_per_sec": 0 00:11:21.047 }, 00:11:21.047 "claimed": false, 00:11:21.047 "zoned": false, 00:11:21.047 "supported_io_types": { 00:11:21.047 "read": true, 00:11:21.047 "write": true, 00:11:21.047 "unmap": true, 00:11:21.047 "flush": true, 00:11:21.047 "reset": true, 00:11:21.047 "nvme_admin": false, 00:11:21.047 "nvme_io": false, 00:11:21.047 "nvme_io_md": false, 00:11:21.047 "write_zeroes": true, 00:11:21.047 "zcopy": false, 00:11:21.047 "get_zone_info": false, 00:11:21.047 "zone_management": false, 00:11:21.047 "zone_append": false, 00:11:21.047 "compare": false, 00:11:21.047 "compare_and_write": false, 00:11:21.047 "abort": false, 00:11:21.047 "seek_hole": false, 00:11:21.047 "seek_data": false, 00:11:21.047 "copy": false, 00:11:21.047 "nvme_iov_md": false 00:11:21.047 }, 00:11:21.047 "memory_domains": [ 00:11:21.047 { 00:11:21.047 "dma_device_id": "system", 00:11:21.047 "dma_device_type": 1 00:11:21.047 }, 00:11:21.047 { 00:11:21.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.047 "dma_device_type": 2 00:11:21.047 }, 00:11:21.047 { 00:11:21.047 "dma_device_id": "system", 00:11:21.047 "dma_device_type": 1 00:11:21.047 }, 00:11:21.047 { 00:11:21.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.047 "dma_device_type": 2 00:11:21.047 }, 00:11:21.047 { 00:11:21.047 "dma_device_id": "system", 00:11:21.047 "dma_device_type": 1 00:11:21.047 }, 00:11:21.047 { 00:11:21.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.047 "dma_device_type": 2 00:11:21.047 } 00:11:21.047 ], 00:11:21.047 "driver_specific": { 00:11:21.047 "raid": { 00:11:21.047 "uuid": "5aea4661-abda-486f-a8ff-a5070ab41fc8", 00:11:21.047 "strip_size_kb": 64, 00:11:21.047 "state": "online", 00:11:21.047 
"raid_level": "raid0", 00:11:21.047 "superblock": false, 00:11:21.047 "num_base_bdevs": 3, 00:11:21.047 "num_base_bdevs_discovered": 3, 00:11:21.047 "num_base_bdevs_operational": 3, 00:11:21.047 "base_bdevs_list": [ 00:11:21.047 { 00:11:21.047 "name": "BaseBdev1", 00:11:21.047 "uuid": "83bd5edf-d670-4503-9039-51c429b95f2d", 00:11:21.047 "is_configured": true, 00:11:21.047 "data_offset": 0, 00:11:21.047 "data_size": 65536 00:11:21.047 }, 00:11:21.047 { 00:11:21.047 "name": "BaseBdev2", 00:11:21.047 "uuid": "d68f4af0-aaf5-4f8a-9bb2-56cc450b6679", 00:11:21.047 "is_configured": true, 00:11:21.047 "data_offset": 0, 00:11:21.047 "data_size": 65536 00:11:21.047 }, 00:11:21.047 { 00:11:21.047 "name": "BaseBdev3", 00:11:21.047 "uuid": "b2568a49-18a5-4a38-9b27-dafd74decab8", 00:11:21.047 "is_configured": true, 00:11:21.047 "data_offset": 0, 00:11:21.047 "data_size": 65536 00:11:21.047 } 00:11:21.047 ] 00:11:21.047 } 00:11:21.047 } 00:11:21.047 }' 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:21.047 BaseBdev2 00:11:21.047 BaseBdev3' 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:21.047 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:21.306 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:21.306 "name": "BaseBdev1", 00:11:21.307 "aliases": [ 00:11:21.307 "83bd5edf-d670-4503-9039-51c429b95f2d" 00:11:21.307 ], 00:11:21.307 "product_name": "Malloc disk", 00:11:21.307 
"block_size": 512, 00:11:21.307 "num_blocks": 65536, 00:11:21.307 "uuid": "83bd5edf-d670-4503-9039-51c429b95f2d", 00:11:21.307 "assigned_rate_limits": { 00:11:21.307 "rw_ios_per_sec": 0, 00:11:21.307 "rw_mbytes_per_sec": 0, 00:11:21.307 "r_mbytes_per_sec": 0, 00:11:21.307 "w_mbytes_per_sec": 0 00:11:21.307 }, 00:11:21.307 "claimed": true, 00:11:21.307 "claim_type": "exclusive_write", 00:11:21.307 "zoned": false, 00:11:21.307 "supported_io_types": { 00:11:21.307 "read": true, 00:11:21.307 "write": true, 00:11:21.307 "unmap": true, 00:11:21.307 "flush": true, 00:11:21.307 "reset": true, 00:11:21.307 "nvme_admin": false, 00:11:21.307 "nvme_io": false, 00:11:21.307 "nvme_io_md": false, 00:11:21.307 "write_zeroes": true, 00:11:21.307 "zcopy": true, 00:11:21.307 "get_zone_info": false, 00:11:21.307 "zone_management": false, 00:11:21.307 "zone_append": false, 00:11:21.307 "compare": false, 00:11:21.307 "compare_and_write": false, 00:11:21.307 "abort": true, 00:11:21.307 "seek_hole": false, 00:11:21.307 "seek_data": false, 00:11:21.307 "copy": true, 00:11:21.307 "nvme_iov_md": false 00:11:21.307 }, 00:11:21.307 "memory_domains": [ 00:11:21.307 { 00:11:21.307 "dma_device_id": "system", 00:11:21.307 "dma_device_type": 1 00:11:21.307 }, 00:11:21.307 { 00:11:21.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.307 "dma_device_type": 2 00:11:21.307 } 00:11:21.307 ], 00:11:21.307 "driver_specific": {} 00:11:21.307 }' 00:11:21.307 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.307 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.307 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:21.307 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.566 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.566 11:52:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:21.566 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.566 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.566 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:21.566 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.566 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.566 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:21.566 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:21.567 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:21.567 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:21.826 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:21.826 "name": "BaseBdev2", 00:11:21.826 "aliases": [ 00:11:21.826 "d68f4af0-aaf5-4f8a-9bb2-56cc450b6679" 00:11:21.826 ], 00:11:21.826 "product_name": "Malloc disk", 00:11:21.826 "block_size": 512, 00:11:21.826 "num_blocks": 65536, 00:11:21.826 "uuid": "d68f4af0-aaf5-4f8a-9bb2-56cc450b6679", 00:11:21.826 "assigned_rate_limits": { 00:11:21.826 "rw_ios_per_sec": 0, 00:11:21.826 "rw_mbytes_per_sec": 0, 00:11:21.826 "r_mbytes_per_sec": 0, 00:11:21.826 "w_mbytes_per_sec": 0 00:11:21.826 }, 00:11:21.826 "claimed": true, 00:11:21.826 "claim_type": "exclusive_write", 00:11:21.826 "zoned": false, 00:11:21.826 "supported_io_types": { 00:11:21.826 "read": true, 00:11:21.826 "write": true, 00:11:21.826 "unmap": true, 00:11:21.826 "flush": true, 00:11:21.826 "reset": true, 00:11:21.826 "nvme_admin": 
false, 00:11:21.826 "nvme_io": false, 00:11:21.826 "nvme_io_md": false, 00:11:21.826 "write_zeroes": true, 00:11:21.826 "zcopy": true, 00:11:21.826 "get_zone_info": false, 00:11:21.826 "zone_management": false, 00:11:21.826 "zone_append": false, 00:11:21.826 "compare": false, 00:11:21.826 "compare_and_write": false, 00:11:21.826 "abort": true, 00:11:21.826 "seek_hole": false, 00:11:21.826 "seek_data": false, 00:11:21.826 "copy": true, 00:11:21.826 "nvme_iov_md": false 00:11:21.826 }, 00:11:21.826 "memory_domains": [ 00:11:21.826 { 00:11:21.826 "dma_device_id": "system", 00:11:21.826 "dma_device_type": 1 00:11:21.826 }, 00:11:21.826 { 00:11:21.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.826 "dma_device_type": 2 00:11:21.826 } 00:11:21.826 ], 00:11:21.826 "driver_specific": {} 00:11:21.826 }' 00:11:21.826 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.826 11:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.826 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:21.826 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.826 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:22.084 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:22.343 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:22.343 "name": "BaseBdev3", 00:11:22.343 "aliases": [ 00:11:22.343 "b2568a49-18a5-4a38-9b27-dafd74decab8" 00:11:22.343 ], 00:11:22.343 "product_name": "Malloc disk", 00:11:22.343 "block_size": 512, 00:11:22.343 "num_blocks": 65536, 00:11:22.343 "uuid": "b2568a49-18a5-4a38-9b27-dafd74decab8", 00:11:22.343 "assigned_rate_limits": { 00:11:22.343 "rw_ios_per_sec": 0, 00:11:22.343 "rw_mbytes_per_sec": 0, 00:11:22.343 "r_mbytes_per_sec": 0, 00:11:22.343 "w_mbytes_per_sec": 0 00:11:22.343 }, 00:11:22.343 "claimed": true, 00:11:22.343 "claim_type": "exclusive_write", 00:11:22.343 "zoned": false, 00:11:22.343 "supported_io_types": { 00:11:22.343 "read": true, 00:11:22.343 "write": true, 00:11:22.343 "unmap": true, 00:11:22.343 "flush": true, 00:11:22.343 "reset": true, 00:11:22.343 "nvme_admin": false, 00:11:22.343 "nvme_io": false, 00:11:22.343 "nvme_io_md": false, 00:11:22.343 "write_zeroes": true, 00:11:22.343 "zcopy": true, 00:11:22.343 "get_zone_info": false, 00:11:22.343 "zone_management": false, 00:11:22.343 "zone_append": false, 00:11:22.343 "compare": false, 00:11:22.343 "compare_and_write": false, 00:11:22.343 "abort": true, 00:11:22.343 "seek_hole": false, 00:11:22.343 "seek_data": false, 00:11:22.343 "copy": true, 00:11:22.343 "nvme_iov_md": false 00:11:22.343 }, 00:11:22.343 "memory_domains": [ 00:11:22.343 { 00:11:22.343 "dma_device_id": "system", 00:11:22.343 "dma_device_type": 1 00:11:22.343 
}, 00:11:22.343 { 00:11:22.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.343 "dma_device_type": 2 00:11:22.343 } 00:11:22.343 ], 00:11:22.343 "driver_specific": {} 00:11:22.343 }' 00:11:22.343 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:22.343 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:22.343 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:22.343 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:22.343 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:22.343 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:22.343 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:22.602 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:22.602 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:22.602 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:22.602 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:22.602 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:22.602 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:22.861 [2024-07-12 11:52:12.869589] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:22.861 [2024-07-12 11:52:12.869610] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:22.861 [2024-07-12 11:52:12.869637] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:22.861 
11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.861 11:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:11:22.861 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.861 "name": "Existed_Raid", 00:11:22.861 "uuid": "5aea4661-abda-486f-a8ff-a5070ab41fc8", 00:11:22.861 "strip_size_kb": 64, 00:11:22.861 "state": "offline", 00:11:22.861 "raid_level": "raid0", 00:11:22.861 "superblock": false, 00:11:22.861 "num_base_bdevs": 3, 00:11:22.861 "num_base_bdevs_discovered": 2, 00:11:22.861 "num_base_bdevs_operational": 2, 00:11:22.861 "base_bdevs_list": [ 00:11:22.861 { 00:11:22.861 "name": null, 00:11:22.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.861 "is_configured": false, 00:11:22.861 "data_offset": 0, 00:11:22.861 "data_size": 65536 00:11:22.861 }, 00:11:22.861 { 00:11:22.861 "name": "BaseBdev2", 00:11:22.861 "uuid": "d68f4af0-aaf5-4f8a-9bb2-56cc450b6679", 00:11:22.861 "is_configured": true, 00:11:22.861 "data_offset": 0, 00:11:22.861 "data_size": 65536 00:11:22.861 }, 00:11:22.861 { 00:11:22.861 "name": "BaseBdev3", 00:11:22.861 "uuid": "b2568a49-18a5-4a38-9b27-dafd74decab8", 00:11:22.861 "is_configured": true, 00:11:22.861 "data_offset": 0, 00:11:22.861 "data_size": 65536 00:11:22.861 } 00:11:22.861 ] 00:11:22.861 }' 00:11:22.861 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.861 11:52:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.428 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:23.428 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:23.428 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.428 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:23.687 11:52:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:23.687 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:23.687 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:23.687 [2024-07-12 11:52:13.865025] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:23.687 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:23.687 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:23.687 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.687 11:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:23.946 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:23.946 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:23.946 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:24.205 [2024-07-12 11:52:14.211774] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:24.205 [2024-07-12 11:52:14.211804] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19db990 name Existed_Raid, state offline 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:24.205 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:24.464 BaseBdev2 00:11:24.464 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:11:24.464 11:52:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:24.464 11:52:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:24.464 11:52:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:24.464 11:52:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:24.464 11:52:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:24.464 11:52:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:24.724 11:52:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:24.724 [ 00:11:24.724 { 00:11:24.724 "name": "BaseBdev2", 00:11:24.724 "aliases": [ 00:11:24.724 "1d384465-f30a-445d-bbfd-6cdd189a1e2c" 00:11:24.724 ], 00:11:24.724 "product_name": "Malloc disk", 00:11:24.724 "block_size": 512, 00:11:24.724 "num_blocks": 65536, 00:11:24.724 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:24.724 "assigned_rate_limits": { 00:11:24.724 "rw_ios_per_sec": 0, 00:11:24.724 "rw_mbytes_per_sec": 0, 00:11:24.724 "r_mbytes_per_sec": 0, 00:11:24.724 "w_mbytes_per_sec": 0 00:11:24.724 }, 00:11:24.724 "claimed": false, 00:11:24.724 "zoned": false, 00:11:24.724 "supported_io_types": { 00:11:24.724 "read": true, 00:11:24.724 "write": true, 00:11:24.724 "unmap": true, 00:11:24.724 "flush": true, 00:11:24.724 "reset": true, 00:11:24.724 "nvme_admin": false, 00:11:24.724 "nvme_io": false, 00:11:24.724 "nvme_io_md": false, 00:11:24.724 "write_zeroes": true, 00:11:24.724 "zcopy": true, 00:11:24.724 "get_zone_info": false, 00:11:24.724 "zone_management": false, 00:11:24.724 "zone_append": false, 00:11:24.724 "compare": false, 00:11:24.724 "compare_and_write": false, 00:11:24.724 "abort": true, 00:11:24.724 "seek_hole": false, 00:11:24.724 "seek_data": false, 00:11:24.724 "copy": true, 00:11:24.724 "nvme_iov_md": false 00:11:24.724 }, 00:11:24.724 "memory_domains": [ 00:11:24.724 { 00:11:24.724 "dma_device_id": "system", 00:11:24.724 "dma_device_type": 1 00:11:24.724 }, 00:11:24.724 { 00:11:24.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.724 "dma_device_type": 2 00:11:24.724 } 00:11:24.724 ], 00:11:24.724 "driver_specific": {} 00:11:24.724 } 00:11:24.724 ] 00:11:24.724 11:52:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:24.724 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:24.724 11:52:14 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:24.724 11:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:24.983 BaseBdev3 00:11:24.983 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:11:24.983 11:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:24.983 11:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:24.983 11:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:24.984 11:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:24.984 11:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:24.984 11:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:24.984 11:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:25.242 [ 00:11:25.242 { 00:11:25.242 "name": "BaseBdev3", 00:11:25.242 "aliases": [ 00:11:25.242 "d6ac266d-0fcc-4db7-9630-30f3531aaaf7" 00:11:25.242 ], 00:11:25.242 "product_name": "Malloc disk", 00:11:25.242 "block_size": 512, 00:11:25.242 "num_blocks": 65536, 00:11:25.242 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:25.242 "assigned_rate_limits": { 00:11:25.242 "rw_ios_per_sec": 0, 00:11:25.242 "rw_mbytes_per_sec": 0, 00:11:25.243 "r_mbytes_per_sec": 0, 00:11:25.243 "w_mbytes_per_sec": 0 00:11:25.243 }, 00:11:25.243 "claimed": false, 00:11:25.243 "zoned": false, 00:11:25.243 
"supported_io_types": { 00:11:25.243 "read": true, 00:11:25.243 "write": true, 00:11:25.243 "unmap": true, 00:11:25.243 "flush": true, 00:11:25.243 "reset": true, 00:11:25.243 "nvme_admin": false, 00:11:25.243 "nvme_io": false, 00:11:25.243 "nvme_io_md": false, 00:11:25.243 "write_zeroes": true, 00:11:25.243 "zcopy": true, 00:11:25.243 "get_zone_info": false, 00:11:25.243 "zone_management": false, 00:11:25.243 "zone_append": false, 00:11:25.243 "compare": false, 00:11:25.243 "compare_and_write": false, 00:11:25.243 "abort": true, 00:11:25.243 "seek_hole": false, 00:11:25.243 "seek_data": false, 00:11:25.243 "copy": true, 00:11:25.243 "nvme_iov_md": false 00:11:25.243 }, 00:11:25.243 "memory_domains": [ 00:11:25.243 { 00:11:25.243 "dma_device_id": "system", 00:11:25.243 "dma_device_type": 1 00:11:25.243 }, 00:11:25.243 { 00:11:25.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:25.243 "dma_device_type": 2 00:11:25.243 } 00:11:25.243 ], 00:11:25.243 "driver_specific": {} 00:11:25.243 } 00:11:25.243 ] 00:11:25.243 11:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:25.243 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:25.243 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:25.243 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:25.502 [2024-07-12 11:52:15.540454] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:25.502 [2024-07-12 11:52:15.540483] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:25.502 [2024-07-12 11:52:15.540494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:25.502 
[2024-07-12 11:52:15.541454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.502 "name": "Existed_Raid", 00:11:25.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.502 "strip_size_kb": 64, 00:11:25.502 "state": "configuring", 00:11:25.502 "raid_level": "raid0", 00:11:25.502 "superblock": false, 00:11:25.502 "num_base_bdevs": 3, 00:11:25.502 
"num_base_bdevs_discovered": 2, 00:11:25.502 "num_base_bdevs_operational": 3, 00:11:25.502 "base_bdevs_list": [ 00:11:25.502 { 00:11:25.502 "name": "BaseBdev1", 00:11:25.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.502 "is_configured": false, 00:11:25.502 "data_offset": 0, 00:11:25.502 "data_size": 0 00:11:25.502 }, 00:11:25.502 { 00:11:25.502 "name": "BaseBdev2", 00:11:25.502 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:25.502 "is_configured": true, 00:11:25.502 "data_offset": 0, 00:11:25.502 "data_size": 65536 00:11:25.502 }, 00:11:25.502 { 00:11:25.502 "name": "BaseBdev3", 00:11:25.502 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:25.502 "is_configured": true, 00:11:25.502 "data_offset": 0, 00:11:25.502 "data_size": 65536 00:11:25.502 } 00:11:25.502 ] 00:11:25.502 }' 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.502 11:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.070 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:26.329 [2024-07-12 11:52:16.370598] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:26.329 "name": "Existed_Raid", 00:11:26.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:26.329 "strip_size_kb": 64, 00:11:26.329 "state": "configuring", 00:11:26.329 "raid_level": "raid0", 00:11:26.329 "superblock": false, 00:11:26.329 "num_base_bdevs": 3, 00:11:26.329 "num_base_bdevs_discovered": 1, 00:11:26.329 "num_base_bdevs_operational": 3, 00:11:26.329 "base_bdevs_list": [ 00:11:26.329 { 00:11:26.329 "name": "BaseBdev1", 00:11:26.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:26.329 "is_configured": false, 00:11:26.329 "data_offset": 0, 00:11:26.329 "data_size": 0 00:11:26.329 }, 00:11:26.329 { 00:11:26.329 "name": null, 00:11:26.329 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:26.329 "is_configured": false, 00:11:26.329 "data_offset": 0, 00:11:26.329 "data_size": 65536 00:11:26.329 }, 00:11:26.329 { 00:11:26.329 "name": "BaseBdev3", 00:11:26.329 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:26.329 "is_configured": true, 00:11:26.329 "data_offset": 0, 00:11:26.329 "data_size": 65536 00:11:26.329 } 
00:11:26.329 ] 00:11:26.329 }' 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:26.329 11:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.897 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.897 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:27.157 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:11:27.157 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:27.157 [2024-07-12 11:52:17.367805] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:27.157 BaseBdev1 00:11:27.157 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:11:27.157 11:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:27.157 11:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:27.157 11:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:27.157 11:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:27.157 11:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:27.157 11:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:27.416 11:52:17 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:27.676 [ 00:11:27.676 { 00:11:27.676 "name": "BaseBdev1", 00:11:27.676 "aliases": [ 00:11:27.676 "cba81225-b4c8-4678-8d3c-620f7f1517de" 00:11:27.676 ], 00:11:27.676 "product_name": "Malloc disk", 00:11:27.676 "block_size": 512, 00:11:27.676 "num_blocks": 65536, 00:11:27.676 "uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:27.676 "assigned_rate_limits": { 00:11:27.676 "rw_ios_per_sec": 0, 00:11:27.676 "rw_mbytes_per_sec": 0, 00:11:27.676 "r_mbytes_per_sec": 0, 00:11:27.676 "w_mbytes_per_sec": 0 00:11:27.676 }, 00:11:27.676 "claimed": true, 00:11:27.676 "claim_type": "exclusive_write", 00:11:27.676 "zoned": false, 00:11:27.676 "supported_io_types": { 00:11:27.676 "read": true, 00:11:27.676 "write": true, 00:11:27.676 "unmap": true, 00:11:27.676 "flush": true, 00:11:27.676 "reset": true, 00:11:27.676 "nvme_admin": false, 00:11:27.676 "nvme_io": false, 00:11:27.676 "nvme_io_md": false, 00:11:27.676 "write_zeroes": true, 00:11:27.676 "zcopy": true, 00:11:27.676 "get_zone_info": false, 00:11:27.676 "zone_management": false, 00:11:27.676 "zone_append": false, 00:11:27.676 "compare": false, 00:11:27.676 "compare_and_write": false, 00:11:27.676 "abort": true, 00:11:27.676 "seek_hole": false, 00:11:27.676 "seek_data": false, 00:11:27.676 "copy": true, 00:11:27.676 "nvme_iov_md": false 00:11:27.676 }, 00:11:27.676 "memory_domains": [ 00:11:27.676 { 00:11:27.676 "dma_device_id": "system", 00:11:27.676 "dma_device_type": 1 00:11:27.676 }, 00:11:27.676 { 00:11:27.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.676 "dma_device_type": 2 00:11:27.676 } 00:11:27.676 ], 00:11:27.676 "driver_specific": {} 00:11:27.676 } 00:11:27.676 ] 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:27.676 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:27.676 "name": "Existed_Raid", 00:11:27.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:27.676 "strip_size_kb": 64, 00:11:27.676 "state": "configuring", 00:11:27.676 "raid_level": "raid0", 00:11:27.676 "superblock": false, 00:11:27.676 "num_base_bdevs": 3, 00:11:27.677 "num_base_bdevs_discovered": 2, 00:11:27.677 "num_base_bdevs_operational": 3, 00:11:27.677 "base_bdevs_list": [ 00:11:27.677 { 00:11:27.677 "name": "BaseBdev1", 00:11:27.677 
"uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:27.677 "is_configured": true, 00:11:27.677 "data_offset": 0, 00:11:27.677 "data_size": 65536 00:11:27.677 }, 00:11:27.677 { 00:11:27.677 "name": null, 00:11:27.677 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:27.677 "is_configured": false, 00:11:27.677 "data_offset": 0, 00:11:27.677 "data_size": 65536 00:11:27.677 }, 00:11:27.677 { 00:11:27.677 "name": "BaseBdev3", 00:11:27.677 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:27.677 "is_configured": true, 00:11:27.677 "data_offset": 0, 00:11:27.677 "data_size": 65536 00:11:27.677 } 00:11:27.677 ] 00:11:27.677 }' 00:11:27.677 11:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:27.677 11:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:28.246 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.246 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:28.246 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:28.246 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:28.506 [2024-07-12 11:52:18.635090] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.506 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:28.765 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:28.765 "name": "Existed_Raid", 00:11:28.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.765 "strip_size_kb": 64, 00:11:28.765 "state": "configuring", 00:11:28.765 "raid_level": "raid0", 00:11:28.765 "superblock": false, 00:11:28.765 "num_base_bdevs": 3, 00:11:28.765 "num_base_bdevs_discovered": 1, 00:11:28.765 "num_base_bdevs_operational": 3, 00:11:28.765 "base_bdevs_list": [ 00:11:28.765 { 00:11:28.765 "name": "BaseBdev1", 00:11:28.765 "uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:28.765 "is_configured": true, 00:11:28.765 "data_offset": 0, 00:11:28.765 "data_size": 65536 00:11:28.765 }, 00:11:28.765 { 00:11:28.765 "name": null, 00:11:28.765 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:28.765 "is_configured": false, 00:11:28.765 
"data_offset": 0, 00:11:28.765 "data_size": 65536 00:11:28.765 }, 00:11:28.765 { 00:11:28.765 "name": null, 00:11:28.765 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:28.765 "is_configured": false, 00:11:28.765 "data_offset": 0, 00:11:28.765 "data_size": 65536 00:11:28.765 } 00:11:28.765 ] 00:11:28.765 }' 00:11:28.765 11:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:28.765 11:52:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:29.333 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.333 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:29.333 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:29.333 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:29.592 [2024-07-12 11:52:19.641740] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.592 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:29.850 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:29.850 "name": "Existed_Raid", 00:11:29.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:29.850 "strip_size_kb": 64, 00:11:29.850 "state": "configuring", 00:11:29.850 "raid_level": "raid0", 00:11:29.850 "superblock": false, 00:11:29.850 "num_base_bdevs": 3, 00:11:29.850 "num_base_bdevs_discovered": 2, 00:11:29.850 "num_base_bdevs_operational": 3, 00:11:29.850 "base_bdevs_list": [ 00:11:29.850 { 00:11:29.850 "name": "BaseBdev1", 00:11:29.850 "uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:29.850 "is_configured": true, 00:11:29.850 "data_offset": 0, 00:11:29.850 "data_size": 65536 00:11:29.850 }, 00:11:29.850 { 00:11:29.850 "name": null, 00:11:29.850 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:29.850 "is_configured": false, 00:11:29.850 "data_offset": 0, 00:11:29.850 "data_size": 65536 00:11:29.850 }, 00:11:29.850 { 00:11:29.850 "name": "BaseBdev3", 00:11:29.850 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:29.850 "is_configured": true, 00:11:29.850 "data_offset": 0, 00:11:29.850 "data_size": 65536 00:11:29.850 } 00:11:29.850 ] 
00:11:29.850 }' 00:11:29.850 11:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:29.850 11:52:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.109 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:30.109 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.367 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:30.367 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:30.627 [2024-07-12 11:52:20.668390] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:30.627 "name": "Existed_Raid", 00:11:30.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:30.627 "strip_size_kb": 64, 00:11:30.627 "state": "configuring", 00:11:30.627 "raid_level": "raid0", 00:11:30.627 "superblock": false, 00:11:30.627 "num_base_bdevs": 3, 00:11:30.627 "num_base_bdevs_discovered": 1, 00:11:30.627 "num_base_bdevs_operational": 3, 00:11:30.627 "base_bdevs_list": [ 00:11:30.627 { 00:11:30.627 "name": null, 00:11:30.627 "uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:30.627 "is_configured": false, 00:11:30.627 "data_offset": 0, 00:11:30.627 "data_size": 65536 00:11:30.627 }, 00:11:30.627 { 00:11:30.627 "name": null, 00:11:30.627 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:30.627 "is_configured": false, 00:11:30.627 "data_offset": 0, 00:11:30.627 "data_size": 65536 00:11:30.627 }, 00:11:30.627 { 00:11:30.627 "name": "BaseBdev3", 00:11:30.627 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:30.627 "is_configured": true, 00:11:30.627 "data_offset": 0, 00:11:30.627 "data_size": 65536 00:11:30.627 } 00:11:30.627 ] 00:11:30.627 }' 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:30.627 11:52:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.194 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 
00:11:31.194 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.452 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:31.453 [2024-07-12 11:52:21.656817] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:11:31.453 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:31.712 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.712 "name": "Existed_Raid", 00:11:31.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:31.712 "strip_size_kb": 64, 00:11:31.712 "state": "configuring", 00:11:31.712 "raid_level": "raid0", 00:11:31.712 "superblock": false, 00:11:31.712 "num_base_bdevs": 3, 00:11:31.712 "num_base_bdevs_discovered": 2, 00:11:31.712 "num_base_bdevs_operational": 3, 00:11:31.712 "base_bdevs_list": [ 00:11:31.712 { 00:11:31.712 "name": null, 00:11:31.712 "uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:31.712 "is_configured": false, 00:11:31.712 "data_offset": 0, 00:11:31.712 "data_size": 65536 00:11:31.712 }, 00:11:31.712 { 00:11:31.712 "name": "BaseBdev2", 00:11:31.712 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:31.712 "is_configured": true, 00:11:31.712 "data_offset": 0, 00:11:31.712 "data_size": 65536 00:11:31.712 }, 00:11:31.712 { 00:11:31.712 "name": "BaseBdev3", 00:11:31.712 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:31.712 "is_configured": true, 00:11:31.712 "data_offset": 0, 00:11:31.712 "data_size": 65536 00:11:31.712 } 00:11:31.712 ] 00:11:31.712 }' 00:11:31.712 11:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.712 11:52:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.278 11:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.278 11:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:32.278 11:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e 
]] 00:11:32.278 11:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.278 11:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:32.536 11:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u cba81225-b4c8-4678-8d3c-620f7f1517de 00:11:32.795 [2024-07-12 11:52:22.838535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:32.795 [2024-07-12 11:52:22.838561] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19dc8b0 00:11:32.795 [2024-07-12 11:52:22.838565] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:32.795 [2024-07-12 11:52:22.838691] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b8f8e0 00:11:32.795 [2024-07-12 11:52:22.838768] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19dc8b0 00:11:32.795 [2024-07-12 11:52:22.838772] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19dc8b0 00:11:32.795 [2024-07-12 11:52:22.838880] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:32.795 NewBaseBdev 00:11:32.795 11:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:11:32.795 11:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:11:32.795 11:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:32.795 11:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:32.795 11:52:22 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:32.795 11:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:32.795 11:52:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:32.795 11:52:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:33.052 [ 00:11:33.052 { 00:11:33.052 "name": "NewBaseBdev", 00:11:33.052 "aliases": [ 00:11:33.052 "cba81225-b4c8-4678-8d3c-620f7f1517de" 00:11:33.052 ], 00:11:33.052 "product_name": "Malloc disk", 00:11:33.052 "block_size": 512, 00:11:33.052 "num_blocks": 65536, 00:11:33.052 "uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:33.052 "assigned_rate_limits": { 00:11:33.052 "rw_ios_per_sec": 0, 00:11:33.052 "rw_mbytes_per_sec": 0, 00:11:33.052 "r_mbytes_per_sec": 0, 00:11:33.052 "w_mbytes_per_sec": 0 00:11:33.052 }, 00:11:33.052 "claimed": true, 00:11:33.052 "claim_type": "exclusive_write", 00:11:33.052 "zoned": false, 00:11:33.052 "supported_io_types": { 00:11:33.052 "read": true, 00:11:33.052 "write": true, 00:11:33.053 "unmap": true, 00:11:33.053 "flush": true, 00:11:33.053 "reset": true, 00:11:33.053 "nvme_admin": false, 00:11:33.053 "nvme_io": false, 00:11:33.053 "nvme_io_md": false, 00:11:33.053 "write_zeroes": true, 00:11:33.053 "zcopy": true, 00:11:33.053 "get_zone_info": false, 00:11:33.053 "zone_management": false, 00:11:33.053 "zone_append": false, 00:11:33.053 "compare": false, 00:11:33.053 "compare_and_write": false, 00:11:33.053 "abort": true, 00:11:33.053 "seek_hole": false, 00:11:33.053 "seek_data": false, 00:11:33.053 "copy": true, 00:11:33.053 "nvme_iov_md": false 00:11:33.053 }, 00:11:33.053 "memory_domains": [ 00:11:33.053 
{ 00:11:33.053 "dma_device_id": "system", 00:11:33.053 "dma_device_type": 1 00:11:33.053 }, 00:11:33.053 { 00:11:33.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.053 "dma_device_type": 2 00:11:33.053 } 00:11:33.053 ], 00:11:33.053 "driver_specific": {} 00:11:33.053 } 00:11:33.053 ] 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.053 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:33.310 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:33.310 
"name": "Existed_Raid", 00:11:33.310 "uuid": "aacf0ccb-0bb9-4e08-8cc8-a743ae5da389", 00:11:33.310 "strip_size_kb": 64, 00:11:33.310 "state": "online", 00:11:33.310 "raid_level": "raid0", 00:11:33.310 "superblock": false, 00:11:33.310 "num_base_bdevs": 3, 00:11:33.310 "num_base_bdevs_discovered": 3, 00:11:33.310 "num_base_bdevs_operational": 3, 00:11:33.310 "base_bdevs_list": [ 00:11:33.310 { 00:11:33.310 "name": "NewBaseBdev", 00:11:33.310 "uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:33.310 "is_configured": true, 00:11:33.310 "data_offset": 0, 00:11:33.310 "data_size": 65536 00:11:33.310 }, 00:11:33.310 { 00:11:33.310 "name": "BaseBdev2", 00:11:33.310 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:33.310 "is_configured": true, 00:11:33.310 "data_offset": 0, 00:11:33.310 "data_size": 65536 00:11:33.310 }, 00:11:33.310 { 00:11:33.310 "name": "BaseBdev3", 00:11:33.310 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:33.310 "is_configured": true, 00:11:33.310 "data_offset": 0, 00:11:33.310 "data_size": 65536 00:11:33.310 } 00:11:33.310 ] 00:11:33.310 }' 00:11:33.310 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:33.310 11:52:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.876 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:11:33.876 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:33.876 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:33.876 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:33.876 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:33.876 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:33.876 11:52:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:33.876 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:33.876 [2024-07-12 11:52:23.973700] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:33.876 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:33.876 "name": "Existed_Raid", 00:11:33.876 "aliases": [ 00:11:33.876 "aacf0ccb-0bb9-4e08-8cc8-a743ae5da389" 00:11:33.876 ], 00:11:33.876 "product_name": "Raid Volume", 00:11:33.876 "block_size": 512, 00:11:33.876 "num_blocks": 196608, 00:11:33.876 "uuid": "aacf0ccb-0bb9-4e08-8cc8-a743ae5da389", 00:11:33.876 "assigned_rate_limits": { 00:11:33.876 "rw_ios_per_sec": 0, 00:11:33.876 "rw_mbytes_per_sec": 0, 00:11:33.876 "r_mbytes_per_sec": 0, 00:11:33.876 "w_mbytes_per_sec": 0 00:11:33.876 }, 00:11:33.876 "claimed": false, 00:11:33.876 "zoned": false, 00:11:33.876 "supported_io_types": { 00:11:33.876 "read": true, 00:11:33.876 "write": true, 00:11:33.876 "unmap": true, 00:11:33.876 "flush": true, 00:11:33.876 "reset": true, 00:11:33.876 "nvme_admin": false, 00:11:33.876 "nvme_io": false, 00:11:33.876 "nvme_io_md": false, 00:11:33.876 "write_zeroes": true, 00:11:33.876 "zcopy": false, 00:11:33.876 "get_zone_info": false, 00:11:33.876 "zone_management": false, 00:11:33.876 "zone_append": false, 00:11:33.876 "compare": false, 00:11:33.876 "compare_and_write": false, 00:11:33.876 "abort": false, 00:11:33.876 "seek_hole": false, 00:11:33.876 "seek_data": false, 00:11:33.876 "copy": false, 00:11:33.876 "nvme_iov_md": false 00:11:33.876 }, 00:11:33.876 "memory_domains": [ 00:11:33.876 { 00:11:33.876 "dma_device_id": "system", 00:11:33.876 "dma_device_type": 1 00:11:33.876 }, 00:11:33.876 { 00:11:33.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.876 
"dma_device_type": 2 00:11:33.876 }, 00:11:33.876 { 00:11:33.876 "dma_device_id": "system", 00:11:33.876 "dma_device_type": 1 00:11:33.876 }, 00:11:33.876 { 00:11:33.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.876 "dma_device_type": 2 00:11:33.876 }, 00:11:33.876 { 00:11:33.876 "dma_device_id": "system", 00:11:33.876 "dma_device_type": 1 00:11:33.876 }, 00:11:33.876 { 00:11:33.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.876 "dma_device_type": 2 00:11:33.876 } 00:11:33.876 ], 00:11:33.876 "driver_specific": { 00:11:33.876 "raid": { 00:11:33.876 "uuid": "aacf0ccb-0bb9-4e08-8cc8-a743ae5da389", 00:11:33.876 "strip_size_kb": 64, 00:11:33.876 "state": "online", 00:11:33.876 "raid_level": "raid0", 00:11:33.876 "superblock": false, 00:11:33.876 "num_base_bdevs": 3, 00:11:33.876 "num_base_bdevs_discovered": 3, 00:11:33.876 "num_base_bdevs_operational": 3, 00:11:33.876 "base_bdevs_list": [ 00:11:33.876 { 00:11:33.876 "name": "NewBaseBdev", 00:11:33.876 "uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:33.876 "is_configured": true, 00:11:33.876 "data_offset": 0, 00:11:33.876 "data_size": 65536 00:11:33.876 }, 00:11:33.876 { 00:11:33.876 "name": "BaseBdev2", 00:11:33.876 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:33.876 "is_configured": true, 00:11:33.876 "data_offset": 0, 00:11:33.876 "data_size": 65536 00:11:33.876 }, 00:11:33.876 { 00:11:33.876 "name": "BaseBdev3", 00:11:33.876 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:33.876 "is_configured": true, 00:11:33.876 "data_offset": 0, 00:11:33.876 "data_size": 65536 00:11:33.876 } 00:11:33.876 ] 00:11:33.876 } 00:11:33.876 } 00:11:33.876 }' 00:11:33.876 11:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:33.876 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:11:33.876 BaseBdev2 00:11:33.876 BaseBdev3' 
00:11:33.876 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:33.876 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:33.876 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:34.135 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:34.135 "name": "NewBaseBdev", 00:11:34.135 "aliases": [ 00:11:34.135 "cba81225-b4c8-4678-8d3c-620f7f1517de" 00:11:34.135 ], 00:11:34.135 "product_name": "Malloc disk", 00:11:34.135 "block_size": 512, 00:11:34.135 "num_blocks": 65536, 00:11:34.135 "uuid": "cba81225-b4c8-4678-8d3c-620f7f1517de", 00:11:34.135 "assigned_rate_limits": { 00:11:34.135 "rw_ios_per_sec": 0, 00:11:34.135 "rw_mbytes_per_sec": 0, 00:11:34.135 "r_mbytes_per_sec": 0, 00:11:34.135 "w_mbytes_per_sec": 0 00:11:34.135 }, 00:11:34.135 "claimed": true, 00:11:34.135 "claim_type": "exclusive_write", 00:11:34.135 "zoned": false, 00:11:34.135 "supported_io_types": { 00:11:34.135 "read": true, 00:11:34.135 "write": true, 00:11:34.135 "unmap": true, 00:11:34.135 "flush": true, 00:11:34.135 "reset": true, 00:11:34.135 "nvme_admin": false, 00:11:34.135 "nvme_io": false, 00:11:34.135 "nvme_io_md": false, 00:11:34.135 "write_zeroes": true, 00:11:34.135 "zcopy": true, 00:11:34.135 "get_zone_info": false, 00:11:34.135 "zone_management": false, 00:11:34.135 "zone_append": false, 00:11:34.135 "compare": false, 00:11:34.135 "compare_and_write": false, 00:11:34.135 "abort": true, 00:11:34.135 "seek_hole": false, 00:11:34.135 "seek_data": false, 00:11:34.135 "copy": true, 00:11:34.135 "nvme_iov_md": false 00:11:34.135 }, 00:11:34.135 "memory_domains": [ 00:11:34.135 { 00:11:34.135 "dma_device_id": "system", 00:11:34.135 "dma_device_type": 1 00:11:34.135 }, 00:11:34.135 { 00:11:34.135 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:34.135 "dma_device_type": 2 00:11:34.135 } 00:11:34.135 ], 00:11:34.135 "driver_specific": {} 00:11:34.135 }' 00:11:34.135 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:34.135 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:34.135 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:34.135 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:34.135 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:34.395 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:34.654 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:34.654 "name": "BaseBdev2", 00:11:34.654 "aliases": [ 00:11:34.654 
"1d384465-f30a-445d-bbfd-6cdd189a1e2c" 00:11:34.654 ], 00:11:34.654 "product_name": "Malloc disk", 00:11:34.654 "block_size": 512, 00:11:34.654 "num_blocks": 65536, 00:11:34.654 "uuid": "1d384465-f30a-445d-bbfd-6cdd189a1e2c", 00:11:34.654 "assigned_rate_limits": { 00:11:34.654 "rw_ios_per_sec": 0, 00:11:34.654 "rw_mbytes_per_sec": 0, 00:11:34.654 "r_mbytes_per_sec": 0, 00:11:34.654 "w_mbytes_per_sec": 0 00:11:34.654 }, 00:11:34.654 "claimed": true, 00:11:34.654 "claim_type": "exclusive_write", 00:11:34.654 "zoned": false, 00:11:34.654 "supported_io_types": { 00:11:34.654 "read": true, 00:11:34.654 "write": true, 00:11:34.654 "unmap": true, 00:11:34.654 "flush": true, 00:11:34.654 "reset": true, 00:11:34.654 "nvme_admin": false, 00:11:34.654 "nvme_io": false, 00:11:34.654 "nvme_io_md": false, 00:11:34.654 "write_zeroes": true, 00:11:34.654 "zcopy": true, 00:11:34.654 "get_zone_info": false, 00:11:34.654 "zone_management": false, 00:11:34.654 "zone_append": false, 00:11:34.654 "compare": false, 00:11:34.654 "compare_and_write": false, 00:11:34.654 "abort": true, 00:11:34.654 "seek_hole": false, 00:11:34.654 "seek_data": false, 00:11:34.654 "copy": true, 00:11:34.654 "nvme_iov_md": false 00:11:34.654 }, 00:11:34.654 "memory_domains": [ 00:11:34.654 { 00:11:34.654 "dma_device_id": "system", 00:11:34.654 "dma_device_type": 1 00:11:34.654 }, 00:11:34.654 { 00:11:34.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:34.654 "dma_device_type": 2 00:11:34.654 } 00:11:34.654 ], 00:11:34.654 "driver_specific": {} 00:11:34.654 }' 00:11:34.654 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:34.654 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:34.654 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:34.654 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:34.654 11:52:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:34.654 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:34.654 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:34.913 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:34.913 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:34.913 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.913 11:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.913 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:34.913 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:34.913 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:34.913 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:35.171 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:35.171 "name": "BaseBdev3", 00:11:35.171 "aliases": [ 00:11:35.171 "d6ac266d-0fcc-4db7-9630-30f3531aaaf7" 00:11:35.171 ], 00:11:35.171 "product_name": "Malloc disk", 00:11:35.171 "block_size": 512, 00:11:35.171 "num_blocks": 65536, 00:11:35.171 "uuid": "d6ac266d-0fcc-4db7-9630-30f3531aaaf7", 00:11:35.171 "assigned_rate_limits": { 00:11:35.171 "rw_ios_per_sec": 0, 00:11:35.171 "rw_mbytes_per_sec": 0, 00:11:35.171 "r_mbytes_per_sec": 0, 00:11:35.171 "w_mbytes_per_sec": 0 00:11:35.171 }, 00:11:35.171 "claimed": true, 00:11:35.171 "claim_type": "exclusive_write", 00:11:35.171 "zoned": false, 00:11:35.171 "supported_io_types": { 00:11:35.171 "read": true, 
00:11:35.171 "write": true, 00:11:35.171 "unmap": true, 00:11:35.171 "flush": true, 00:11:35.171 "reset": true, 00:11:35.171 "nvme_admin": false, 00:11:35.171 "nvme_io": false, 00:11:35.171 "nvme_io_md": false, 00:11:35.171 "write_zeroes": true, 00:11:35.171 "zcopy": true, 00:11:35.171 "get_zone_info": false, 00:11:35.171 "zone_management": false, 00:11:35.171 "zone_append": false, 00:11:35.171 "compare": false, 00:11:35.171 "compare_and_write": false, 00:11:35.171 "abort": true, 00:11:35.171 "seek_hole": false, 00:11:35.171 "seek_data": false, 00:11:35.171 "copy": true, 00:11:35.171 "nvme_iov_md": false 00:11:35.171 }, 00:11:35.171 "memory_domains": [ 00:11:35.171 { 00:11:35.171 "dma_device_id": "system", 00:11:35.171 "dma_device_type": 1 00:11:35.171 }, 00:11:35.171 { 00:11:35.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:35.171 "dma_device_type": 2 00:11:35.171 } 00:11:35.171 ], 00:11:35.171 "driver_specific": {} 00:11:35.171 }' 00:11:35.171 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:35.171 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:35.171 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:35.171 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:35.171 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:35.171 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:35.171 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:35.171 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:35.430 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:35.430 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:35.430 
11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:35.430 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:35.430 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:35.430 [2024-07-12 11:52:25.673894] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:35.430 [2024-07-12 11:52:25.673913] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:35.430 [2024-07-12 11:52:25.673950] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:35.430 [2024-07-12 11:52:25.673985] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:35.430 [2024-07-12 11:52:25.673991] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19dc8b0 name Existed_Raid, state offline 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 610707 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 610707 ']' 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 610707 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 610707 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:35.689 
11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 610707' 00:11:35.689 killing process with pid 610707 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 610707 00:11:35.689 [2024-07-12 11:52:25.738472] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 610707 00:11:35.689 [2024-07-12 11:52:25.761615] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:35.689 00:11:35.689 real 0m21.257s 00:11:35.689 user 0m39.628s 00:11:35.689 sys 0m3.299s 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:35.689 11:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.689 ************************************ 00:11:35.689 END TEST raid_state_function_test 00:11:35.689 ************************************ 00:11:35.948 11:52:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:35.948 11:52:25 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:11:35.948 11:52:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:35.948 11:52:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:35.948 11:52:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:35.948 ************************************ 00:11:35.948 START TEST raid_state_function_test_sb 00:11:35.948 ************************************ 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local 
raid_level=raid0 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local 
strip_size 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=614746 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 614746' 00:11:35.948 Process raid pid: 614746 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 614746 /var/tmp/spdk-raid.sock 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 614746 ']' 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:11:35.948 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:35.948 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:35.948 [2024-07-12 11:52:26.062026] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:11:35.948 [2024-07-12 11:52:26.062067] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:35.948 [2024-07-12 11:52:26.127939] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:36.208 [2024-07-12 11:52:26.198450] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:36.208 [2024-07-12 11:52:26.248564] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:36.208 [2024-07-12 11:52:26.248587] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:36.775 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:36.775 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:36.775 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:36.775 [2024-07-12 11:52:27.003184] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:36.775 [2024-07-12 11:52:27.003216] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:36.775 [2024-07-12 11:52:27.003222] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:11:36.775 [2024-07-12 11:52:27.003228] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:36.775 [2024-07-12 11:52:27.003232] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:36.775 [2024-07-12 11:52:27.003237] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:37.032 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:37.032 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:37.032 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:37.032 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:37.032 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.032 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:37.032 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.032 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.032 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.033 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.033 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.033 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.033 11:52:27 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.033 "name": "Existed_Raid", 00:11:37.033 "uuid": "0863ee01-bb1a-4cc7-a834-39f5fa5b419d", 00:11:37.033 "strip_size_kb": 64, 00:11:37.033 "state": "configuring", 00:11:37.033 "raid_level": "raid0", 00:11:37.033 "superblock": true, 00:11:37.033 "num_base_bdevs": 3, 00:11:37.033 "num_base_bdevs_discovered": 0, 00:11:37.033 "num_base_bdevs_operational": 3, 00:11:37.033 "base_bdevs_list": [ 00:11:37.033 { 00:11:37.033 "name": "BaseBdev1", 00:11:37.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.033 "is_configured": false, 00:11:37.033 "data_offset": 0, 00:11:37.033 "data_size": 0 00:11:37.033 }, 00:11:37.033 { 00:11:37.033 "name": "BaseBdev2", 00:11:37.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.033 "is_configured": false, 00:11:37.033 "data_offset": 0, 00:11:37.033 "data_size": 0 00:11:37.033 }, 00:11:37.033 { 00:11:37.033 "name": "BaseBdev3", 00:11:37.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.033 "is_configured": false, 00:11:37.033 "data_offset": 0, 00:11:37.033 "data_size": 0 00:11:37.033 } 00:11:37.033 ] 00:11:37.033 }' 00:11:37.033 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.033 11:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:37.600 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:37.600 [2024-07-12 11:52:27.781100] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:37.600 [2024-07-12 11:52:27.781120] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15bc1d0 name Existed_Raid, state configuring 00:11:37.600 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:37.858 [2024-07-12 11:52:27.961583] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:37.858 [2024-07-12 11:52:27.961600] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:37.858 [2024-07-12 11:52:27.961604] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:37.858 [2024-07-12 11:52:27.961610] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:37.858 [2024-07-12 11:52:27.961614] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:37.858 [2024-07-12 11:52:27.961618] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:37.858 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:38.116 [2024-07-12 11:52:28.150221] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:38.116 BaseBdev1 00:11:38.116 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:38.116 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:38.116 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:38.116 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:38.116 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:38.116 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:38.116 11:52:28 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:38.116 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:38.374 [ 00:11:38.374 { 00:11:38.374 "name": "BaseBdev1", 00:11:38.374 "aliases": [ 00:11:38.374 "7bbe4162-200e-41c4-a197-1509a773229b" 00:11:38.374 ], 00:11:38.374 "product_name": "Malloc disk", 00:11:38.374 "block_size": 512, 00:11:38.374 "num_blocks": 65536, 00:11:38.374 "uuid": "7bbe4162-200e-41c4-a197-1509a773229b", 00:11:38.374 "assigned_rate_limits": { 00:11:38.374 "rw_ios_per_sec": 0, 00:11:38.374 "rw_mbytes_per_sec": 0, 00:11:38.375 "r_mbytes_per_sec": 0, 00:11:38.375 "w_mbytes_per_sec": 0 00:11:38.375 }, 00:11:38.375 "claimed": true, 00:11:38.375 "claim_type": "exclusive_write", 00:11:38.375 "zoned": false, 00:11:38.375 "supported_io_types": { 00:11:38.375 "read": true, 00:11:38.375 "write": true, 00:11:38.375 "unmap": true, 00:11:38.375 "flush": true, 00:11:38.375 "reset": true, 00:11:38.375 "nvme_admin": false, 00:11:38.375 "nvme_io": false, 00:11:38.375 "nvme_io_md": false, 00:11:38.375 "write_zeroes": true, 00:11:38.375 "zcopy": true, 00:11:38.375 "get_zone_info": false, 00:11:38.375 "zone_management": false, 00:11:38.375 "zone_append": false, 00:11:38.375 "compare": false, 00:11:38.375 "compare_and_write": false, 00:11:38.375 "abort": true, 00:11:38.375 "seek_hole": false, 00:11:38.375 "seek_data": false, 00:11:38.375 "copy": true, 00:11:38.375 "nvme_iov_md": false 00:11:38.375 }, 00:11:38.375 "memory_domains": [ 00:11:38.375 { 00:11:38.375 "dma_device_id": "system", 00:11:38.375 "dma_device_type": 1 00:11:38.375 }, 00:11:38.375 { 00:11:38.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.375 "dma_device_type": 2 00:11:38.375 } 
00:11:38.375 ], 00:11:38.375 "driver_specific": {} 00:11:38.375 } 00:11:38.375 ] 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.375 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:38.634 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:38.634 "name": "Existed_Raid", 00:11:38.634 "uuid": "e996fba9-1eb3-4004-a2f2-15d8552b84c2", 00:11:38.634 "strip_size_kb": 64, 00:11:38.634 "state": "configuring", 
00:11:38.634 "raid_level": "raid0", 00:11:38.634 "superblock": true, 00:11:38.634 "num_base_bdevs": 3, 00:11:38.634 "num_base_bdevs_discovered": 1, 00:11:38.634 "num_base_bdevs_operational": 3, 00:11:38.634 "base_bdevs_list": [ 00:11:38.634 { 00:11:38.634 "name": "BaseBdev1", 00:11:38.634 "uuid": "7bbe4162-200e-41c4-a197-1509a773229b", 00:11:38.634 "is_configured": true, 00:11:38.634 "data_offset": 2048, 00:11:38.634 "data_size": 63488 00:11:38.634 }, 00:11:38.634 { 00:11:38.634 "name": "BaseBdev2", 00:11:38.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:38.634 "is_configured": false, 00:11:38.634 "data_offset": 0, 00:11:38.634 "data_size": 0 00:11:38.634 }, 00:11:38.634 { 00:11:38.634 "name": "BaseBdev3", 00:11:38.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:38.634 "is_configured": false, 00:11:38.634 "data_offset": 0, 00:11:38.634 "data_size": 0 00:11:38.634 } 00:11:38.634 ] 00:11:38.634 }' 00:11:38.634 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:38.634 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:39.203 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:39.203 [2024-07-12 11:52:29.305198] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:39.203 [2024-07-12 11:52:29.305229] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15bbaa0 name Existed_Raid, state configuring 00:11:39.203 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:39.462 [2024-07-12 11:52:29.481686] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 
is claimed 00:11:39.462 [2024-07-12 11:52:29.482704] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:39.462 [2024-07-12 11:52:29.482730] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:39.462 [2024-07-12 11:52:29.482735] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:39.462 [2024-07-12 11:52:29.482740] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.462 "name": "Existed_Raid", 00:11:39.462 "uuid": "bef7d298-8844-4666-8606-4ad2ed631f43", 00:11:39.462 "strip_size_kb": 64, 00:11:39.462 "state": "configuring", 00:11:39.462 "raid_level": "raid0", 00:11:39.462 "superblock": true, 00:11:39.462 "num_base_bdevs": 3, 00:11:39.462 "num_base_bdevs_discovered": 1, 00:11:39.462 "num_base_bdevs_operational": 3, 00:11:39.462 "base_bdevs_list": [ 00:11:39.462 { 00:11:39.462 "name": "BaseBdev1", 00:11:39.462 "uuid": "7bbe4162-200e-41c4-a197-1509a773229b", 00:11:39.462 "is_configured": true, 00:11:39.462 "data_offset": 2048, 00:11:39.462 "data_size": 63488 00:11:39.462 }, 00:11:39.462 { 00:11:39.462 "name": "BaseBdev2", 00:11:39.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.462 "is_configured": false, 00:11:39.462 "data_offset": 0, 00:11:39.462 "data_size": 0 00:11:39.462 }, 00:11:39.462 { 00:11:39.462 "name": "BaseBdev3", 00:11:39.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.462 "is_configured": false, 00:11:39.462 "data_offset": 0, 00:11:39.462 "data_size": 0 00:11:39.462 } 00:11:39.462 ] 00:11:39.462 }' 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.462 11:52:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:40.030 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:40.288 [2024-07-12 11:52:30.302451] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:40.288 BaseBdev2 00:11:40.288 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:40.288 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:40.288 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:40.288 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:40.288 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:40.288 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:40.288 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:40.288 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:40.547 [ 00:11:40.547 { 00:11:40.547 "name": "BaseBdev2", 00:11:40.547 "aliases": [ 00:11:40.547 "fb27c1d1-89aa-49bc-bd23-469b73dfca30" 00:11:40.547 ], 00:11:40.547 "product_name": "Malloc disk", 00:11:40.547 "block_size": 512, 00:11:40.547 "num_blocks": 65536, 00:11:40.547 "uuid": "fb27c1d1-89aa-49bc-bd23-469b73dfca30", 00:11:40.547 "assigned_rate_limits": { 00:11:40.547 "rw_ios_per_sec": 0, 00:11:40.547 "rw_mbytes_per_sec": 0, 00:11:40.547 "r_mbytes_per_sec": 0, 00:11:40.547 "w_mbytes_per_sec": 0 00:11:40.547 }, 00:11:40.547 "claimed": true, 00:11:40.547 "claim_type": "exclusive_write", 00:11:40.547 "zoned": false, 00:11:40.547 "supported_io_types": { 00:11:40.547 "read": true, 00:11:40.547 "write": true, 00:11:40.547 "unmap": true, 00:11:40.547 "flush": 
true, 00:11:40.547 "reset": true, 00:11:40.547 "nvme_admin": false, 00:11:40.547 "nvme_io": false, 00:11:40.547 "nvme_io_md": false, 00:11:40.547 "write_zeroes": true, 00:11:40.547 "zcopy": true, 00:11:40.547 "get_zone_info": false, 00:11:40.547 "zone_management": false, 00:11:40.547 "zone_append": false, 00:11:40.547 "compare": false, 00:11:40.547 "compare_and_write": false, 00:11:40.547 "abort": true, 00:11:40.547 "seek_hole": false, 00:11:40.547 "seek_data": false, 00:11:40.547 "copy": true, 00:11:40.547 "nvme_iov_md": false 00:11:40.547 }, 00:11:40.547 "memory_domains": [ 00:11:40.547 { 00:11:40.547 "dma_device_id": "system", 00:11:40.547 "dma_device_type": 1 00:11:40.547 }, 00:11:40.547 { 00:11:40.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.547 "dma_device_type": 2 00:11:40.547 } 00:11:40.547 ], 00:11:40.547 "driver_specific": {} 00:11:40.547 } 00:11:40.547 ] 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:40.547 11:52:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.547 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:40.806 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:40.806 "name": "Existed_Raid", 00:11:40.806 "uuid": "bef7d298-8844-4666-8606-4ad2ed631f43", 00:11:40.806 "strip_size_kb": 64, 00:11:40.806 "state": "configuring", 00:11:40.806 "raid_level": "raid0", 00:11:40.806 "superblock": true, 00:11:40.806 "num_base_bdevs": 3, 00:11:40.806 "num_base_bdevs_discovered": 2, 00:11:40.806 "num_base_bdevs_operational": 3, 00:11:40.806 "base_bdevs_list": [ 00:11:40.806 { 00:11:40.806 "name": "BaseBdev1", 00:11:40.806 "uuid": "7bbe4162-200e-41c4-a197-1509a773229b", 00:11:40.806 "is_configured": true, 00:11:40.806 "data_offset": 2048, 00:11:40.806 "data_size": 63488 00:11:40.806 }, 00:11:40.806 { 00:11:40.806 "name": "BaseBdev2", 00:11:40.806 "uuid": "fb27c1d1-89aa-49bc-bd23-469b73dfca30", 00:11:40.806 "is_configured": true, 00:11:40.806 "data_offset": 2048, 00:11:40.806 "data_size": 63488 00:11:40.806 }, 00:11:40.806 { 00:11:40.806 "name": "BaseBdev3", 00:11:40.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:40.806 "is_configured": false, 00:11:40.806 "data_offset": 0, 00:11:40.806 "data_size": 0 00:11:40.806 } 00:11:40.806 ] 00:11:40.806 }' 00:11:40.806 11:52:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:40.806 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:41.373 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:41.373 [2024-07-12 11:52:31.468182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:41.373 [2024-07-12 11:52:31.468299] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15bc990 00:11:41.373 [2024-07-12 11:52:31.468308] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:41.373 [2024-07-12 11:52:31.468431] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12c2210 00:11:41.373 [2024-07-12 11:52:31.468525] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15bc990 00:11:41.373 [2024-07-12 11:52:31.468531] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15bc990 00:11:41.373 [2024-07-12 11:52:31.468600] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:41.373 BaseBdev3 00:11:41.373 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:41.373 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:41.373 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:41.373 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:41.373 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:41.373 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:11:41.373 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:41.632 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:41.632 [ 00:11:41.632 { 00:11:41.632 "name": "BaseBdev3", 00:11:41.632 "aliases": [ 00:11:41.632 "e87d2ac3-b383-4c5a-a918-091d5d43602f" 00:11:41.632 ], 00:11:41.632 "product_name": "Malloc disk", 00:11:41.632 "block_size": 512, 00:11:41.632 "num_blocks": 65536, 00:11:41.632 "uuid": "e87d2ac3-b383-4c5a-a918-091d5d43602f", 00:11:41.632 "assigned_rate_limits": { 00:11:41.632 "rw_ios_per_sec": 0, 00:11:41.632 "rw_mbytes_per_sec": 0, 00:11:41.632 "r_mbytes_per_sec": 0, 00:11:41.632 "w_mbytes_per_sec": 0 00:11:41.632 }, 00:11:41.632 "claimed": true, 00:11:41.632 "claim_type": "exclusive_write", 00:11:41.632 "zoned": false, 00:11:41.632 "supported_io_types": { 00:11:41.632 "read": true, 00:11:41.632 "write": true, 00:11:41.632 "unmap": true, 00:11:41.632 "flush": true, 00:11:41.632 "reset": true, 00:11:41.632 "nvme_admin": false, 00:11:41.632 "nvme_io": false, 00:11:41.632 "nvme_io_md": false, 00:11:41.632 "write_zeroes": true, 00:11:41.632 "zcopy": true, 00:11:41.632 "get_zone_info": false, 00:11:41.632 "zone_management": false, 00:11:41.632 "zone_append": false, 00:11:41.632 "compare": false, 00:11:41.632 "compare_and_write": false, 00:11:41.632 "abort": true, 00:11:41.632 "seek_hole": false, 00:11:41.632 "seek_data": false, 00:11:41.632 "copy": true, 00:11:41.632 "nvme_iov_md": false 00:11:41.632 }, 00:11:41.632 "memory_domains": [ 00:11:41.632 { 00:11:41.632 "dma_device_id": "system", 00:11:41.632 "dma_device_type": 1 00:11:41.632 }, 00:11:41.632 { 00:11:41.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.632 "dma_device_type": 2 
00:11:41.632 } 00:11:41.632 ], 00:11:41.632 "driver_specific": {} 00:11:41.632 } 00:11:41.632 ] 00:11:41.632 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:41.632 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:41.632 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:41.632 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:41.632 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:41.632 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:41.632 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:41.632 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:41.633 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:41.633 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:41.633 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:41.633 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:41.633 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:41.633 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:41.633 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.926 11:52:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:41.926 "name": "Existed_Raid", 00:11:41.926 "uuid": "bef7d298-8844-4666-8606-4ad2ed631f43", 00:11:41.926 "strip_size_kb": 64, 00:11:41.926 "state": "online", 00:11:41.926 "raid_level": "raid0", 00:11:41.926 "superblock": true, 00:11:41.926 "num_base_bdevs": 3, 00:11:41.926 "num_base_bdevs_discovered": 3, 00:11:41.926 "num_base_bdevs_operational": 3, 00:11:41.926 "base_bdevs_list": [ 00:11:41.926 { 00:11:41.927 "name": "BaseBdev1", 00:11:41.927 "uuid": "7bbe4162-200e-41c4-a197-1509a773229b", 00:11:41.927 "is_configured": true, 00:11:41.927 "data_offset": 2048, 00:11:41.927 "data_size": 63488 00:11:41.927 }, 00:11:41.927 { 00:11:41.927 "name": "BaseBdev2", 00:11:41.927 "uuid": "fb27c1d1-89aa-49bc-bd23-469b73dfca30", 00:11:41.927 "is_configured": true, 00:11:41.927 "data_offset": 2048, 00:11:41.927 "data_size": 63488 00:11:41.927 }, 00:11:41.927 { 00:11:41.927 "name": "BaseBdev3", 00:11:41.927 "uuid": "e87d2ac3-b383-4c5a-a918-091d5d43602f", 00:11:41.927 "is_configured": true, 00:11:41.927 "data_offset": 2048, 00:11:41.927 "data_size": 63488 00:11:41.927 } 00:11:41.927 ] 00:11:41.927 }' 00:11:41.927 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:41.927 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:42.588 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:42.588 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:42.588 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:42.588 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:42.588 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:42.588 11:52:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:42.588 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:42.588 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:42.588 [2024-07-12 11:52:32.647442] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:42.588 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:42.588 "name": "Existed_Raid", 00:11:42.588 "aliases": [ 00:11:42.588 "bef7d298-8844-4666-8606-4ad2ed631f43" 00:11:42.588 ], 00:11:42.588 "product_name": "Raid Volume", 00:11:42.588 "block_size": 512, 00:11:42.588 "num_blocks": 190464, 00:11:42.588 "uuid": "bef7d298-8844-4666-8606-4ad2ed631f43", 00:11:42.588 "assigned_rate_limits": { 00:11:42.588 "rw_ios_per_sec": 0, 00:11:42.588 "rw_mbytes_per_sec": 0, 00:11:42.588 "r_mbytes_per_sec": 0, 00:11:42.588 "w_mbytes_per_sec": 0 00:11:42.588 }, 00:11:42.588 "claimed": false, 00:11:42.588 "zoned": false, 00:11:42.588 "supported_io_types": { 00:11:42.588 "read": true, 00:11:42.588 "write": true, 00:11:42.588 "unmap": true, 00:11:42.588 "flush": true, 00:11:42.588 "reset": true, 00:11:42.588 "nvme_admin": false, 00:11:42.588 "nvme_io": false, 00:11:42.588 "nvme_io_md": false, 00:11:42.588 "write_zeroes": true, 00:11:42.588 "zcopy": false, 00:11:42.588 "get_zone_info": false, 00:11:42.588 "zone_management": false, 00:11:42.588 "zone_append": false, 00:11:42.588 "compare": false, 00:11:42.588 "compare_and_write": false, 00:11:42.588 "abort": false, 00:11:42.588 "seek_hole": false, 00:11:42.588 "seek_data": false, 00:11:42.588 "copy": false, 00:11:42.588 "nvme_iov_md": false 00:11:42.588 }, 00:11:42.588 "memory_domains": [ 00:11:42.588 { 00:11:42.588 "dma_device_id": "system", 00:11:42.588 "dma_device_type": 1 
00:11:42.588 }, 00:11:42.588 { 00:11:42.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.588 "dma_device_type": 2 00:11:42.588 }, 00:11:42.588 { 00:11:42.588 "dma_device_id": "system", 00:11:42.588 "dma_device_type": 1 00:11:42.588 }, 00:11:42.588 { 00:11:42.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.588 "dma_device_type": 2 00:11:42.588 }, 00:11:42.588 { 00:11:42.588 "dma_device_id": "system", 00:11:42.588 "dma_device_type": 1 00:11:42.588 }, 00:11:42.588 { 00:11:42.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.589 "dma_device_type": 2 00:11:42.589 } 00:11:42.589 ], 00:11:42.589 "driver_specific": { 00:11:42.589 "raid": { 00:11:42.589 "uuid": "bef7d298-8844-4666-8606-4ad2ed631f43", 00:11:42.589 "strip_size_kb": 64, 00:11:42.589 "state": "online", 00:11:42.589 "raid_level": "raid0", 00:11:42.589 "superblock": true, 00:11:42.589 "num_base_bdevs": 3, 00:11:42.589 "num_base_bdevs_discovered": 3, 00:11:42.589 "num_base_bdevs_operational": 3, 00:11:42.589 "base_bdevs_list": [ 00:11:42.589 { 00:11:42.589 "name": "BaseBdev1", 00:11:42.589 "uuid": "7bbe4162-200e-41c4-a197-1509a773229b", 00:11:42.589 "is_configured": true, 00:11:42.589 "data_offset": 2048, 00:11:42.589 "data_size": 63488 00:11:42.589 }, 00:11:42.589 { 00:11:42.589 "name": "BaseBdev2", 00:11:42.589 "uuid": "fb27c1d1-89aa-49bc-bd23-469b73dfca30", 00:11:42.589 "is_configured": true, 00:11:42.589 "data_offset": 2048, 00:11:42.589 "data_size": 63488 00:11:42.589 }, 00:11:42.589 { 00:11:42.589 "name": "BaseBdev3", 00:11:42.589 "uuid": "e87d2ac3-b383-4c5a-a918-091d5d43602f", 00:11:42.589 "is_configured": true, 00:11:42.589 "data_offset": 2048, 00:11:42.589 "data_size": 63488 00:11:42.589 } 00:11:42.589 ] 00:11:42.589 } 00:11:42.589 } 00:11:42.589 }' 00:11:42.589 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:42.589 11:52:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:42.589 BaseBdev2 00:11:42.589 BaseBdev3' 00:11:42.589 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:42.589 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:42.589 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:42.866 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:42.866 "name": "BaseBdev1", 00:11:42.866 "aliases": [ 00:11:42.866 "7bbe4162-200e-41c4-a197-1509a773229b" 00:11:42.866 ], 00:11:42.866 "product_name": "Malloc disk", 00:11:42.866 "block_size": 512, 00:11:42.866 "num_blocks": 65536, 00:11:42.866 "uuid": "7bbe4162-200e-41c4-a197-1509a773229b", 00:11:42.866 "assigned_rate_limits": { 00:11:42.866 "rw_ios_per_sec": 0, 00:11:42.866 "rw_mbytes_per_sec": 0, 00:11:42.866 "r_mbytes_per_sec": 0, 00:11:42.866 "w_mbytes_per_sec": 0 00:11:42.866 }, 00:11:42.866 "claimed": true, 00:11:42.866 "claim_type": "exclusive_write", 00:11:42.866 "zoned": false, 00:11:42.866 "supported_io_types": { 00:11:42.866 "read": true, 00:11:42.866 "write": true, 00:11:42.866 "unmap": true, 00:11:42.866 "flush": true, 00:11:42.866 "reset": true, 00:11:42.866 "nvme_admin": false, 00:11:42.866 "nvme_io": false, 00:11:42.866 "nvme_io_md": false, 00:11:42.866 "write_zeroes": true, 00:11:42.866 "zcopy": true, 00:11:42.866 "get_zone_info": false, 00:11:42.866 "zone_management": false, 00:11:42.866 "zone_append": false, 00:11:42.866 "compare": false, 00:11:42.866 "compare_and_write": false, 00:11:42.866 "abort": true, 00:11:42.866 "seek_hole": false, 00:11:42.866 "seek_data": false, 00:11:42.866 "copy": true, 00:11:42.866 "nvme_iov_md": false 00:11:42.866 }, 00:11:42.866 "memory_domains": [ 00:11:42.866 { 00:11:42.866 
"dma_device_id": "system", 00:11:42.866 "dma_device_type": 1 00:11:42.866 }, 00:11:42.866 { 00:11:42.866 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.866 "dma_device_type": 2 00:11:42.866 } 00:11:42.866 ], 00:11:42.866 "driver_specific": {} 00:11:42.866 }' 00:11:42.866 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.866 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.866 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:42.866 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:42.866 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:42.866 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:42.866 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:42.866 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:43.178 11:52:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:43.178 "name": "BaseBdev2", 00:11:43.178 "aliases": [ 00:11:43.178 "fb27c1d1-89aa-49bc-bd23-469b73dfca30" 00:11:43.178 ], 00:11:43.178 "product_name": "Malloc disk", 00:11:43.178 "block_size": 512, 00:11:43.178 "num_blocks": 65536, 00:11:43.178 "uuid": "fb27c1d1-89aa-49bc-bd23-469b73dfca30", 00:11:43.178 "assigned_rate_limits": { 00:11:43.178 "rw_ios_per_sec": 0, 00:11:43.178 "rw_mbytes_per_sec": 0, 00:11:43.178 "r_mbytes_per_sec": 0, 00:11:43.178 "w_mbytes_per_sec": 0 00:11:43.178 }, 00:11:43.178 "claimed": true, 00:11:43.178 "claim_type": "exclusive_write", 00:11:43.178 "zoned": false, 00:11:43.178 "supported_io_types": { 00:11:43.178 "read": true, 00:11:43.178 "write": true, 00:11:43.178 "unmap": true, 00:11:43.178 "flush": true, 00:11:43.178 "reset": true, 00:11:43.178 "nvme_admin": false, 00:11:43.178 "nvme_io": false, 00:11:43.178 "nvme_io_md": false, 00:11:43.178 "write_zeroes": true, 00:11:43.178 "zcopy": true, 00:11:43.178 "get_zone_info": false, 00:11:43.178 "zone_management": false, 00:11:43.178 "zone_append": false, 00:11:43.178 "compare": false, 00:11:43.178 "compare_and_write": false, 00:11:43.178 "abort": true, 00:11:43.178 "seek_hole": false, 00:11:43.178 "seek_data": false, 00:11:43.178 "copy": true, 00:11:43.178 "nvme_iov_md": false 00:11:43.178 }, 00:11:43.178 "memory_domains": [ 00:11:43.178 { 00:11:43.178 "dma_device_id": "system", 00:11:43.178 "dma_device_type": 1 00:11:43.178 }, 00:11:43.178 { 00:11:43.178 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.178 "dma_device_type": 2 00:11:43.178 } 00:11:43.178 ], 00:11:43.178 "driver_specific": {} 00:11:43.178 }' 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:11:43.178 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:43.436 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:43.693 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:43.693 "name": "BaseBdev3", 00:11:43.693 "aliases": [ 00:11:43.693 "e87d2ac3-b383-4c5a-a918-091d5d43602f" 00:11:43.693 ], 00:11:43.693 "product_name": "Malloc disk", 00:11:43.693 "block_size": 512, 00:11:43.693 "num_blocks": 65536, 00:11:43.693 "uuid": "e87d2ac3-b383-4c5a-a918-091d5d43602f", 00:11:43.693 "assigned_rate_limits": { 00:11:43.693 "rw_ios_per_sec": 0, 00:11:43.693 "rw_mbytes_per_sec": 0, 00:11:43.693 "r_mbytes_per_sec": 0, 00:11:43.693 "w_mbytes_per_sec": 0 
00:11:43.693 }, 00:11:43.693 "claimed": true, 00:11:43.693 "claim_type": "exclusive_write", 00:11:43.693 "zoned": false, 00:11:43.693 "supported_io_types": { 00:11:43.693 "read": true, 00:11:43.693 "write": true, 00:11:43.693 "unmap": true, 00:11:43.693 "flush": true, 00:11:43.693 "reset": true, 00:11:43.693 "nvme_admin": false, 00:11:43.694 "nvme_io": false, 00:11:43.694 "nvme_io_md": false, 00:11:43.694 "write_zeroes": true, 00:11:43.694 "zcopy": true, 00:11:43.694 "get_zone_info": false, 00:11:43.694 "zone_management": false, 00:11:43.694 "zone_append": false, 00:11:43.694 "compare": false, 00:11:43.694 "compare_and_write": false, 00:11:43.694 "abort": true, 00:11:43.694 "seek_hole": false, 00:11:43.694 "seek_data": false, 00:11:43.694 "copy": true, 00:11:43.694 "nvme_iov_md": false 00:11:43.694 }, 00:11:43.694 "memory_domains": [ 00:11:43.694 { 00:11:43.694 "dma_device_id": "system", 00:11:43.694 "dma_device_type": 1 00:11:43.694 }, 00:11:43.694 { 00:11:43.694 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.694 "dma_device_type": 2 00:11:43.694 } 00:11:43.694 ], 00:11:43.694 "driver_specific": {} 00:11:43.694 }' 00:11:43.694 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:43.694 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:43.694 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:43.694 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.951 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.951 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:43.951 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.952 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.952 
11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:43.952 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.952 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.952 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:43.952 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:44.210 [2024-07-12 11:52:34.255439] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:44.210 [2024-07-12 11:52:34.255462] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:44.210 [2024-07-12 11:52:34.255490] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.210 "name": "Existed_Raid", 00:11:44.210 "uuid": "bef7d298-8844-4666-8606-4ad2ed631f43", 00:11:44.210 "strip_size_kb": 64, 00:11:44.210 "state": "offline", 00:11:44.210 "raid_level": "raid0", 00:11:44.210 "superblock": true, 00:11:44.210 "num_base_bdevs": 3, 00:11:44.210 "num_base_bdevs_discovered": 2, 00:11:44.210 "num_base_bdevs_operational": 2, 00:11:44.210 "base_bdevs_list": [ 00:11:44.210 { 00:11:44.210 "name": null, 00:11:44.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.210 "is_configured": false, 00:11:44.210 "data_offset": 2048, 00:11:44.210 "data_size": 63488 00:11:44.210 }, 00:11:44.210 { 00:11:44.210 "name": "BaseBdev2", 00:11:44.210 "uuid": "fb27c1d1-89aa-49bc-bd23-469b73dfca30", 00:11:44.210 "is_configured": true, 00:11:44.210 "data_offset": 2048, 00:11:44.210 "data_size": 63488 00:11:44.210 }, 00:11:44.210 
{ 00:11:44.210 "name": "BaseBdev3", 00:11:44.210 "uuid": "e87d2ac3-b383-4c5a-a918-091d5d43602f", 00:11:44.210 "is_configured": true, 00:11:44.210 "data_offset": 2048, 00:11:44.210 "data_size": 63488 00:11:44.210 } 00:11:44.210 ] 00:11:44.210 }' 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.210 11:52:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:44.778 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:44.778 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:44.778 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.778 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:45.037 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:45.037 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:45.037 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:45.037 [2024-07-12 11:52:35.226692] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:45.037 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:45.037 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:45.037 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.037 11:52:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:45.296 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:45.296 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:45.296 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:45.555 [2024-07-12 11:52:35.573228] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:45.555 [2024-07-12 11:52:35.573261] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15bc990 name Existed_Raid, state offline 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:45.555 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:45.813 BaseBdev2 00:11:45.813 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:11:45.814 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:45.814 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:45.814 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:45.814 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:45.814 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:45.814 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:46.072 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:46.072 [ 00:11:46.072 { 00:11:46.072 "name": "BaseBdev2", 00:11:46.072 "aliases": [ 00:11:46.072 "79022b95-57cc-4882-a0ba-117643bc4456" 00:11:46.072 ], 00:11:46.072 "product_name": "Malloc disk", 00:11:46.072 "block_size": 512, 00:11:46.072 "num_blocks": 65536, 00:11:46.072 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:46.072 "assigned_rate_limits": { 00:11:46.072 "rw_ios_per_sec": 0, 00:11:46.072 "rw_mbytes_per_sec": 0, 00:11:46.072 "r_mbytes_per_sec": 0, 00:11:46.072 "w_mbytes_per_sec": 0 00:11:46.072 }, 00:11:46.072 "claimed": false, 00:11:46.072 "zoned": false, 00:11:46.072 "supported_io_types": { 00:11:46.072 "read": true, 00:11:46.072 "write": true, 00:11:46.072 "unmap": true, 00:11:46.072 "flush": 
true, 00:11:46.072 "reset": true, 00:11:46.072 "nvme_admin": false, 00:11:46.072 "nvme_io": false, 00:11:46.072 "nvme_io_md": false, 00:11:46.072 "write_zeroes": true, 00:11:46.072 "zcopy": true, 00:11:46.072 "get_zone_info": false, 00:11:46.072 "zone_management": false, 00:11:46.072 "zone_append": false, 00:11:46.072 "compare": false, 00:11:46.072 "compare_and_write": false, 00:11:46.072 "abort": true, 00:11:46.072 "seek_hole": false, 00:11:46.072 "seek_data": false, 00:11:46.072 "copy": true, 00:11:46.072 "nvme_iov_md": false 00:11:46.072 }, 00:11:46.072 "memory_domains": [ 00:11:46.072 { 00:11:46.072 "dma_device_id": "system", 00:11:46.072 "dma_device_type": 1 00:11:46.072 }, 00:11:46.072 { 00:11:46.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.072 "dma_device_type": 2 00:11:46.072 } 00:11:46.072 ], 00:11:46.072 "driver_specific": {} 00:11:46.072 } 00:11:46.072 ] 00:11:46.072 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:46.072 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:46.072 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:46.072 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:46.332 BaseBdev3 00:11:46.332 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:11:46.332 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:11:46.332 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:46.332 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:46.332 11:52:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:46.332 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:46.332 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:46.591 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:46.591 [ 00:11:46.591 { 00:11:46.591 "name": "BaseBdev3", 00:11:46.591 "aliases": [ 00:11:46.591 "0a149009-5fe5-405e-8570-f7c9c20a1a1b" 00:11:46.591 ], 00:11:46.591 "product_name": "Malloc disk", 00:11:46.591 "block_size": 512, 00:11:46.591 "num_blocks": 65536, 00:11:46.591 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:46.591 "assigned_rate_limits": { 00:11:46.591 "rw_ios_per_sec": 0, 00:11:46.591 "rw_mbytes_per_sec": 0, 00:11:46.591 "r_mbytes_per_sec": 0, 00:11:46.591 "w_mbytes_per_sec": 0 00:11:46.591 }, 00:11:46.591 "claimed": false, 00:11:46.591 "zoned": false, 00:11:46.591 "supported_io_types": { 00:11:46.591 "read": true, 00:11:46.591 "write": true, 00:11:46.591 "unmap": true, 00:11:46.591 "flush": true, 00:11:46.591 "reset": true, 00:11:46.591 "nvme_admin": false, 00:11:46.591 "nvme_io": false, 00:11:46.591 "nvme_io_md": false, 00:11:46.591 "write_zeroes": true, 00:11:46.591 "zcopy": true, 00:11:46.591 "get_zone_info": false, 00:11:46.591 "zone_management": false, 00:11:46.591 "zone_append": false, 00:11:46.591 "compare": false, 00:11:46.591 "compare_and_write": false, 00:11:46.591 "abort": true, 00:11:46.591 "seek_hole": false, 00:11:46.591 "seek_data": false, 00:11:46.591 "copy": true, 00:11:46.591 "nvme_iov_md": false 00:11:46.591 }, 00:11:46.591 "memory_domains": [ 00:11:46.591 { 00:11:46.591 "dma_device_id": "system", 00:11:46.591 "dma_device_type": 1 
00:11:46.591 }, 00:11:46.591 { 00:11:46.591 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.591 "dma_device_type": 2 00:11:46.591 } 00:11:46.591 ], 00:11:46.591 "driver_specific": {} 00:11:46.591 } 00:11:46.591 ] 00:11:46.591 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:46.591 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:46.591 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:46.592 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:46.851 [2024-07-12 11:52:36.897813] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:46.851 [2024-07-12 11:52:36.897842] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:46.851 [2024-07-12 11:52:36.897853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:46.851 [2024-07-12 11:52:36.898820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.851 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:46.851 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:46.851 "name": "Existed_Raid", 00:11:46.851 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:46.851 "strip_size_kb": 64, 00:11:46.851 "state": "configuring", 00:11:46.851 "raid_level": "raid0", 00:11:46.851 "superblock": true, 00:11:46.851 "num_base_bdevs": 3, 00:11:46.851 "num_base_bdevs_discovered": 2, 00:11:46.851 "num_base_bdevs_operational": 3, 00:11:46.851 "base_bdevs_list": [ 00:11:46.851 { 00:11:46.851 "name": "BaseBdev1", 00:11:46.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:46.851 "is_configured": false, 00:11:46.851 "data_offset": 0, 00:11:46.851 "data_size": 0 00:11:46.851 }, 00:11:46.851 { 00:11:46.851 "name": "BaseBdev2", 00:11:46.851 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:46.851 "is_configured": true, 00:11:46.851 "data_offset": 2048, 00:11:46.851 "data_size": 63488 00:11:46.851 }, 00:11:46.851 { 00:11:46.851 "name": "BaseBdev3", 00:11:46.851 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:46.851 "is_configured": true, 00:11:46.851 "data_offset": 2048, 00:11:46.851 
"data_size": 63488 00:11:46.851 } 00:11:46.851 ] 00:11:46.851 }' 00:11:46.851 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:46.851 11:52:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:47.417 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:47.674 [2024-07-12 11:52:37.707903] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.674 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.674 "name": "Existed_Raid", 00:11:47.674 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:47.674 "strip_size_kb": 64, 00:11:47.674 "state": "configuring", 00:11:47.674 "raid_level": "raid0", 00:11:47.674 "superblock": true, 00:11:47.674 "num_base_bdevs": 3, 00:11:47.674 "num_base_bdevs_discovered": 1, 00:11:47.674 "num_base_bdevs_operational": 3, 00:11:47.674 "base_bdevs_list": [ 00:11:47.674 { 00:11:47.674 "name": "BaseBdev1", 00:11:47.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.675 "is_configured": false, 00:11:47.675 "data_offset": 0, 00:11:47.675 "data_size": 0 00:11:47.675 }, 00:11:47.675 { 00:11:47.675 "name": null, 00:11:47.675 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:47.675 "is_configured": false, 00:11:47.675 "data_offset": 2048, 00:11:47.675 "data_size": 63488 00:11:47.675 }, 00:11:47.675 { 00:11:47.675 "name": "BaseBdev3", 00:11:47.675 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:47.675 "is_configured": true, 00:11:47.675 "data_offset": 2048, 00:11:47.675 "data_size": 63488 00:11:47.675 } 00:11:47.675 ] 00:11:47.675 }' 00:11:47.675 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.675 11:52:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:48.240 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.240 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:48.499 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:11:48.499 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:48.499 [2024-07-12 11:52:38.681133] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:48.499 BaseBdev1 00:11:48.499 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:11:48.499 11:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:48.499 11:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:48.499 11:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:48.499 11:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:48.499 11:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:48.499 11:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:48.758 11:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:48.758 [ 00:11:48.758 { 00:11:48.758 "name": "BaseBdev1", 00:11:48.758 "aliases": [ 00:11:48.758 "4d555bfa-5700-4005-9bbc-9eaa93afd6e8" 00:11:48.758 ], 00:11:48.758 "product_name": "Malloc disk", 00:11:48.758 "block_size": 512, 00:11:48.758 "num_blocks": 65536, 00:11:48.758 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:48.758 "assigned_rate_limits": { 00:11:48.758 "rw_ios_per_sec": 0, 00:11:48.758 "rw_mbytes_per_sec": 0, 00:11:48.758 "r_mbytes_per_sec": 0, 00:11:48.758 
"w_mbytes_per_sec": 0 00:11:48.758 }, 00:11:48.758 "claimed": true, 00:11:48.758 "claim_type": "exclusive_write", 00:11:48.758 "zoned": false, 00:11:48.758 "supported_io_types": { 00:11:48.758 "read": true, 00:11:48.758 "write": true, 00:11:48.758 "unmap": true, 00:11:48.758 "flush": true, 00:11:48.758 "reset": true, 00:11:48.758 "nvme_admin": false, 00:11:48.758 "nvme_io": false, 00:11:48.758 "nvme_io_md": false, 00:11:48.758 "write_zeroes": true, 00:11:48.758 "zcopy": true, 00:11:48.758 "get_zone_info": false, 00:11:48.758 "zone_management": false, 00:11:48.758 "zone_append": false, 00:11:48.758 "compare": false, 00:11:48.758 "compare_and_write": false, 00:11:48.758 "abort": true, 00:11:48.758 "seek_hole": false, 00:11:48.758 "seek_data": false, 00:11:48.758 "copy": true, 00:11:48.758 "nvme_iov_md": false 00:11:48.758 }, 00:11:48.758 "memory_domains": [ 00:11:48.758 { 00:11:48.758 "dma_device_id": "system", 00:11:48.758 "dma_device_type": 1 00:11:48.758 }, 00:11:48.758 { 00:11:48.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.758 "dma_device_type": 2 00:11:48.758 } 00:11:48.758 ], 00:11:48.758 "driver_specific": {} 00:11:48.758 } 00:11:48.758 ] 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.017 "name": "Existed_Raid", 00:11:49.017 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:49.017 "strip_size_kb": 64, 00:11:49.017 "state": "configuring", 00:11:49.017 "raid_level": "raid0", 00:11:49.017 "superblock": true, 00:11:49.017 "num_base_bdevs": 3, 00:11:49.017 "num_base_bdevs_discovered": 2, 00:11:49.017 "num_base_bdevs_operational": 3, 00:11:49.017 "base_bdevs_list": [ 00:11:49.017 { 00:11:49.017 "name": "BaseBdev1", 00:11:49.017 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:49.017 "is_configured": true, 00:11:49.017 "data_offset": 2048, 00:11:49.017 "data_size": 63488 00:11:49.017 }, 00:11:49.017 { 00:11:49.017 "name": null, 00:11:49.017 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:49.017 "is_configured": false, 00:11:49.017 "data_offset": 2048, 00:11:49.017 "data_size": 63488 00:11:49.017 }, 00:11:49.017 { 00:11:49.017 "name": "BaseBdev3", 00:11:49.017 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:49.017 "is_configured": true, 00:11:49.017 "data_offset": 2048, 00:11:49.017 "data_size": 63488 00:11:49.017 } 
00:11:49.017 ] 00:11:49.017 }' 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.017 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.586 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.586 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:49.844 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:49.844 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:49.844 [2024-07-12 11:52:40.004575] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.844 
11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.844 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.102 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.102 "name": "Existed_Raid", 00:11:50.102 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:50.102 "strip_size_kb": 64, 00:11:50.102 "state": "configuring", 00:11:50.102 "raid_level": "raid0", 00:11:50.102 "superblock": true, 00:11:50.102 "num_base_bdevs": 3, 00:11:50.102 "num_base_bdevs_discovered": 1, 00:11:50.102 "num_base_bdevs_operational": 3, 00:11:50.102 "base_bdevs_list": [ 00:11:50.102 { 00:11:50.102 "name": "BaseBdev1", 00:11:50.102 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:50.102 "is_configured": true, 00:11:50.102 "data_offset": 2048, 00:11:50.102 "data_size": 63488 00:11:50.102 }, 00:11:50.102 { 00:11:50.102 "name": null, 00:11:50.102 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:50.102 "is_configured": false, 00:11:50.102 "data_offset": 2048, 00:11:50.102 "data_size": 63488 00:11:50.102 }, 00:11:50.102 { 00:11:50.102 "name": null, 00:11:50.102 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:50.102 "is_configured": false, 00:11:50.102 "data_offset": 2048, 00:11:50.102 "data_size": 63488 00:11:50.102 } 00:11:50.102 ] 00:11:50.102 }' 00:11:50.102 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.102 11:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:50.669 11:52:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.669 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:50.669 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:50.669 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:50.927 [2024-07-12 11:52:40.983120] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:50.927 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:50.927 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:50.927 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:50.927 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:50.927 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:50.928 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:50.928 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.928 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.928 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.928 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.928 11:52:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.928 11:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.928 11:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.928 "name": "Existed_Raid", 00:11:50.928 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:50.928 "strip_size_kb": 64, 00:11:50.928 "state": "configuring", 00:11:50.928 "raid_level": "raid0", 00:11:50.928 "superblock": true, 00:11:50.928 "num_base_bdevs": 3, 00:11:50.928 "num_base_bdevs_discovered": 2, 00:11:50.928 "num_base_bdevs_operational": 3, 00:11:50.928 "base_bdevs_list": [ 00:11:50.928 { 00:11:50.928 "name": "BaseBdev1", 00:11:50.928 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:50.928 "is_configured": true, 00:11:50.928 "data_offset": 2048, 00:11:50.928 "data_size": 63488 00:11:50.928 }, 00:11:50.928 { 00:11:50.928 "name": null, 00:11:50.928 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:50.928 "is_configured": false, 00:11:50.928 "data_offset": 2048, 00:11:50.928 "data_size": 63488 00:11:50.928 }, 00:11:50.928 { 00:11:50.928 "name": "BaseBdev3", 00:11:50.928 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:50.928 "is_configured": true, 00:11:50.928 "data_offset": 2048, 00:11:50.928 "data_size": 63488 00:11:50.928 } 00:11:50.928 ] 00:11:50.928 }' 00:11:50.928 11:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.928 11:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:51.494 11:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:51.494 11:52:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:51.753 11:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:51.753 11:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:51.753 [2024-07-12 11:52:41.985742] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:52.011 "name": "Existed_Raid", 00:11:52.011 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:52.011 "strip_size_kb": 64, 00:11:52.011 "state": "configuring", 00:11:52.011 "raid_level": "raid0", 00:11:52.011 "superblock": true, 00:11:52.011 "num_base_bdevs": 3, 00:11:52.011 "num_base_bdevs_discovered": 1, 00:11:52.011 "num_base_bdevs_operational": 3, 00:11:52.011 "base_bdevs_list": [ 00:11:52.011 { 00:11:52.011 "name": null, 00:11:52.011 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:52.011 "is_configured": false, 00:11:52.011 "data_offset": 2048, 00:11:52.011 "data_size": 63488 00:11:52.011 }, 00:11:52.011 { 00:11:52.011 "name": null, 00:11:52.011 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:52.011 "is_configured": false, 00:11:52.011 "data_offset": 2048, 00:11:52.011 "data_size": 63488 00:11:52.011 }, 00:11:52.011 { 00:11:52.011 "name": "BaseBdev3", 00:11:52.011 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:52.011 "is_configured": true, 00:11:52.011 "data_offset": 2048, 00:11:52.011 "data_size": 63488 00:11:52.011 } 00:11:52.011 ] 00:11:52.011 }' 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:52.011 11:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:52.577 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.577 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:52.835 11:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:11:52.835 11:52:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:52.835 [2024-07-12 11:52:42.986060] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.835 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:53.094 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.094 "name": 
"Existed_Raid", 00:11:53.094 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:53.094 "strip_size_kb": 64, 00:11:53.094 "state": "configuring", 00:11:53.094 "raid_level": "raid0", 00:11:53.094 "superblock": true, 00:11:53.094 "num_base_bdevs": 3, 00:11:53.094 "num_base_bdevs_discovered": 2, 00:11:53.094 "num_base_bdevs_operational": 3, 00:11:53.094 "base_bdevs_list": [ 00:11:53.094 { 00:11:53.094 "name": null, 00:11:53.094 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:53.094 "is_configured": false, 00:11:53.094 "data_offset": 2048, 00:11:53.094 "data_size": 63488 00:11:53.094 }, 00:11:53.094 { 00:11:53.094 "name": "BaseBdev2", 00:11:53.094 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:53.094 "is_configured": true, 00:11:53.094 "data_offset": 2048, 00:11:53.094 "data_size": 63488 00:11:53.094 }, 00:11:53.094 { 00:11:53.094 "name": "BaseBdev3", 00:11:53.094 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:53.094 "is_configured": true, 00:11:53.094 "data_offset": 2048, 00:11:53.094 "data_size": 63488 00:11:53.094 } 00:11:53.094 ] 00:11:53.094 }' 00:11:53.094 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.094 11:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:53.662 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.662 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:53.662 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:11:53.662 11:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.662 11:52:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:53.921 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4d555bfa-5700-4005-9bbc-9eaa93afd6e8 00:11:54.180 [2024-07-12 11:52:44.179883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:54.180 [2024-07-12 11:52:44.180007] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15b52a0 00:11:54.180 [2024-07-12 11:52:44.180016] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:54.180 [2024-07-12 11:52:44.180144] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a8940 00:11:54.180 [2024-07-12 11:52:44.180224] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15b52a0 00:11:54.180 [2024-07-12 11:52:44.180229] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15b52a0 00:11:54.180 [2024-07-12 11:52:44.180293] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:54.180 NewBaseBdev 00:11:54.180 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:11:54.180 11:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:11:54.180 11:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:54.180 11:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:54.180 11:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:54.180 11:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:54.180 11:52:44 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:54.180 11:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:54.439 [ 00:11:54.439 { 00:11:54.439 "name": "NewBaseBdev", 00:11:54.439 "aliases": [ 00:11:54.439 "4d555bfa-5700-4005-9bbc-9eaa93afd6e8" 00:11:54.439 ], 00:11:54.439 "product_name": "Malloc disk", 00:11:54.439 "block_size": 512, 00:11:54.439 "num_blocks": 65536, 00:11:54.439 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:54.439 "assigned_rate_limits": { 00:11:54.439 "rw_ios_per_sec": 0, 00:11:54.439 "rw_mbytes_per_sec": 0, 00:11:54.439 "r_mbytes_per_sec": 0, 00:11:54.439 "w_mbytes_per_sec": 0 00:11:54.439 }, 00:11:54.439 "claimed": true, 00:11:54.439 "claim_type": "exclusive_write", 00:11:54.439 "zoned": false, 00:11:54.439 "supported_io_types": { 00:11:54.439 "read": true, 00:11:54.439 "write": true, 00:11:54.439 "unmap": true, 00:11:54.439 "flush": true, 00:11:54.439 "reset": true, 00:11:54.439 "nvme_admin": false, 00:11:54.439 "nvme_io": false, 00:11:54.439 "nvme_io_md": false, 00:11:54.439 "write_zeroes": true, 00:11:54.439 "zcopy": true, 00:11:54.439 "get_zone_info": false, 00:11:54.439 "zone_management": false, 00:11:54.439 "zone_append": false, 00:11:54.439 "compare": false, 00:11:54.439 "compare_and_write": false, 00:11:54.439 "abort": true, 00:11:54.439 "seek_hole": false, 00:11:54.439 "seek_data": false, 00:11:54.439 "copy": true, 00:11:54.439 "nvme_iov_md": false 00:11:54.439 }, 00:11:54.439 "memory_domains": [ 00:11:54.439 { 00:11:54.439 "dma_device_id": "system", 00:11:54.439 "dma_device_type": 1 00:11:54.439 }, 00:11:54.439 { 00:11:54.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.439 "dma_device_type": 2 00:11:54.439 } 
00:11:54.439 ], 00:11:54.439 "driver_specific": {} 00:11:54.439 } 00:11:54.439 ] 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.439 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:54.697 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.697 "name": "Existed_Raid", 00:11:54.697 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:54.697 "strip_size_kb": 64, 00:11:54.697 "state": "online", 00:11:54.697 
"raid_level": "raid0", 00:11:54.697 "superblock": true, 00:11:54.697 "num_base_bdevs": 3, 00:11:54.697 "num_base_bdevs_discovered": 3, 00:11:54.697 "num_base_bdevs_operational": 3, 00:11:54.697 "base_bdevs_list": [ 00:11:54.697 { 00:11:54.697 "name": "NewBaseBdev", 00:11:54.697 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:54.697 "is_configured": true, 00:11:54.697 "data_offset": 2048, 00:11:54.697 "data_size": 63488 00:11:54.697 }, 00:11:54.697 { 00:11:54.697 "name": "BaseBdev2", 00:11:54.697 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:54.697 "is_configured": true, 00:11:54.697 "data_offset": 2048, 00:11:54.697 "data_size": 63488 00:11:54.697 }, 00:11:54.697 { 00:11:54.697 "name": "BaseBdev3", 00:11:54.697 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:54.697 "is_configured": true, 00:11:54.697 "data_offset": 2048, 00:11:54.697 "data_size": 63488 00:11:54.697 } 00:11:54.697 ] 00:11:54.697 }' 00:11:54.697 11:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.697 11:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.955 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:11:54.955 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:54.955 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:54.955 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:54.955 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:54.955 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:54.955 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:54.955 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:55.214 [2024-07-12 11:52:45.315033] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:55.214 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:55.214 "name": "Existed_Raid", 00:11:55.214 "aliases": [ 00:11:55.214 "4c6133de-f0e8-4bac-a62d-72bf309880ee" 00:11:55.214 ], 00:11:55.214 "product_name": "Raid Volume", 00:11:55.214 "block_size": 512, 00:11:55.214 "num_blocks": 190464, 00:11:55.214 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:55.214 "assigned_rate_limits": { 00:11:55.214 "rw_ios_per_sec": 0, 00:11:55.214 "rw_mbytes_per_sec": 0, 00:11:55.214 "r_mbytes_per_sec": 0, 00:11:55.214 "w_mbytes_per_sec": 0 00:11:55.214 }, 00:11:55.214 "claimed": false, 00:11:55.214 "zoned": false, 00:11:55.214 "supported_io_types": { 00:11:55.214 "read": true, 00:11:55.214 "write": true, 00:11:55.214 "unmap": true, 00:11:55.214 "flush": true, 00:11:55.214 "reset": true, 00:11:55.214 "nvme_admin": false, 00:11:55.214 "nvme_io": false, 00:11:55.214 "nvme_io_md": false, 00:11:55.214 "write_zeroes": true, 00:11:55.214 "zcopy": false, 00:11:55.214 "get_zone_info": false, 00:11:55.214 "zone_management": false, 00:11:55.214 "zone_append": false, 00:11:55.214 "compare": false, 00:11:55.214 "compare_and_write": false, 00:11:55.214 "abort": false, 00:11:55.214 "seek_hole": false, 00:11:55.214 "seek_data": false, 00:11:55.214 "copy": false, 00:11:55.214 "nvme_iov_md": false 00:11:55.214 }, 00:11:55.214 "memory_domains": [ 00:11:55.214 { 00:11:55.214 "dma_device_id": "system", 00:11:55.214 "dma_device_type": 1 00:11:55.214 }, 00:11:55.214 { 00:11:55.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.214 "dma_device_type": 2 00:11:55.214 }, 00:11:55.214 { 00:11:55.214 "dma_device_id": "system", 00:11:55.214 "dma_device_type": 1 00:11:55.214 
}, 00:11:55.214 { 00:11:55.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.214 "dma_device_type": 2 00:11:55.214 }, 00:11:55.215 { 00:11:55.215 "dma_device_id": "system", 00:11:55.215 "dma_device_type": 1 00:11:55.215 }, 00:11:55.215 { 00:11:55.215 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.215 "dma_device_type": 2 00:11:55.215 } 00:11:55.215 ], 00:11:55.215 "driver_specific": { 00:11:55.215 "raid": { 00:11:55.215 "uuid": "4c6133de-f0e8-4bac-a62d-72bf309880ee", 00:11:55.215 "strip_size_kb": 64, 00:11:55.215 "state": "online", 00:11:55.215 "raid_level": "raid0", 00:11:55.215 "superblock": true, 00:11:55.215 "num_base_bdevs": 3, 00:11:55.215 "num_base_bdevs_discovered": 3, 00:11:55.215 "num_base_bdevs_operational": 3, 00:11:55.215 "base_bdevs_list": [ 00:11:55.215 { 00:11:55.215 "name": "NewBaseBdev", 00:11:55.215 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:55.215 "is_configured": true, 00:11:55.215 "data_offset": 2048, 00:11:55.215 "data_size": 63488 00:11:55.215 }, 00:11:55.215 { 00:11:55.215 "name": "BaseBdev2", 00:11:55.215 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:55.215 "is_configured": true, 00:11:55.215 "data_offset": 2048, 00:11:55.215 "data_size": 63488 00:11:55.215 }, 00:11:55.215 { 00:11:55.215 "name": "BaseBdev3", 00:11:55.215 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:55.215 "is_configured": true, 00:11:55.215 "data_offset": 2048, 00:11:55.215 "data_size": 63488 00:11:55.215 } 00:11:55.215 ] 00:11:55.215 } 00:11:55.215 } 00:11:55.215 }' 00:11:55.215 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:55.215 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:11:55.215 BaseBdev2 00:11:55.215 BaseBdev3' 00:11:55.215 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:55.215 
11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:55.215 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:55.474 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:55.474 "name": "NewBaseBdev", 00:11:55.474 "aliases": [ 00:11:55.474 "4d555bfa-5700-4005-9bbc-9eaa93afd6e8" 00:11:55.474 ], 00:11:55.474 "product_name": "Malloc disk", 00:11:55.474 "block_size": 512, 00:11:55.474 "num_blocks": 65536, 00:11:55.474 "uuid": "4d555bfa-5700-4005-9bbc-9eaa93afd6e8", 00:11:55.474 "assigned_rate_limits": { 00:11:55.474 "rw_ios_per_sec": 0, 00:11:55.474 "rw_mbytes_per_sec": 0, 00:11:55.474 "r_mbytes_per_sec": 0, 00:11:55.474 "w_mbytes_per_sec": 0 00:11:55.474 }, 00:11:55.474 "claimed": true, 00:11:55.474 "claim_type": "exclusive_write", 00:11:55.474 "zoned": false, 00:11:55.474 "supported_io_types": { 00:11:55.474 "read": true, 00:11:55.474 "write": true, 00:11:55.474 "unmap": true, 00:11:55.474 "flush": true, 00:11:55.474 "reset": true, 00:11:55.474 "nvme_admin": false, 00:11:55.474 "nvme_io": false, 00:11:55.474 "nvme_io_md": false, 00:11:55.474 "write_zeroes": true, 00:11:55.474 "zcopy": true, 00:11:55.474 "get_zone_info": false, 00:11:55.474 "zone_management": false, 00:11:55.474 "zone_append": false, 00:11:55.474 "compare": false, 00:11:55.474 "compare_and_write": false, 00:11:55.474 "abort": true, 00:11:55.474 "seek_hole": false, 00:11:55.474 "seek_data": false, 00:11:55.474 "copy": true, 00:11:55.474 "nvme_iov_md": false 00:11:55.474 }, 00:11:55.474 "memory_domains": [ 00:11:55.474 { 00:11:55.474 "dma_device_id": "system", 00:11:55.474 "dma_device_type": 1 00:11:55.474 }, 00:11:55.474 { 00:11:55.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.474 "dma_device_type": 2 00:11:55.474 } 00:11:55.474 ], 00:11:55.474 
"driver_specific": {} 00:11:55.474 }' 00:11:55.474 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.474 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.474 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:55.474 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.474 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.474 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:55.474 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:55.474 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:55.733 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:55.733 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:55.733 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:55.733 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:55.733 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:55.733 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:55.733 11:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:55.993 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:55.993 "name": "BaseBdev2", 00:11:55.993 "aliases": [ 00:11:55.993 "79022b95-57cc-4882-a0ba-117643bc4456" 00:11:55.993 ], 00:11:55.993 "product_name": 
"Malloc disk", 00:11:55.993 "block_size": 512, 00:11:55.993 "num_blocks": 65536, 00:11:55.993 "uuid": "79022b95-57cc-4882-a0ba-117643bc4456", 00:11:55.993 "assigned_rate_limits": { 00:11:55.993 "rw_ios_per_sec": 0, 00:11:55.993 "rw_mbytes_per_sec": 0, 00:11:55.993 "r_mbytes_per_sec": 0, 00:11:55.993 "w_mbytes_per_sec": 0 00:11:55.993 }, 00:11:55.993 "claimed": true, 00:11:55.993 "claim_type": "exclusive_write", 00:11:55.993 "zoned": false, 00:11:55.993 "supported_io_types": { 00:11:55.993 "read": true, 00:11:55.993 "write": true, 00:11:55.993 "unmap": true, 00:11:55.993 "flush": true, 00:11:55.993 "reset": true, 00:11:55.993 "nvme_admin": false, 00:11:55.993 "nvme_io": false, 00:11:55.993 "nvme_io_md": false, 00:11:55.993 "write_zeroes": true, 00:11:55.993 "zcopy": true, 00:11:55.993 "get_zone_info": false, 00:11:55.993 "zone_management": false, 00:11:55.993 "zone_append": false, 00:11:55.993 "compare": false, 00:11:55.993 "compare_and_write": false, 00:11:55.993 "abort": true, 00:11:55.993 "seek_hole": false, 00:11:55.993 "seek_data": false, 00:11:55.993 "copy": true, 00:11:55.993 "nvme_iov_md": false 00:11:55.993 }, 00:11:55.993 "memory_domains": [ 00:11:55.993 { 00:11:55.993 "dma_device_id": "system", 00:11:55.993 "dma_device_type": 1 00:11:55.993 }, 00:11:55.993 { 00:11:55.993 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.993 "dma_device_type": 2 00:11:55.993 } 00:11:55.993 ], 00:11:55.993 "driver_specific": {} 00:11:55.993 }' 00:11:55.993 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.993 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.993 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:55.993 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.993 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.993 
11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:55.993 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:55.993 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:56.252 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:56.252 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:56.252 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:56.252 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:56.252 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:56.252 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:56.252 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:56.511 "name": "BaseBdev3", 00:11:56.511 "aliases": [ 00:11:56.511 "0a149009-5fe5-405e-8570-f7c9c20a1a1b" 00:11:56.511 ], 00:11:56.511 "product_name": "Malloc disk", 00:11:56.511 "block_size": 512, 00:11:56.511 "num_blocks": 65536, 00:11:56.511 "uuid": "0a149009-5fe5-405e-8570-f7c9c20a1a1b", 00:11:56.511 "assigned_rate_limits": { 00:11:56.511 "rw_ios_per_sec": 0, 00:11:56.511 "rw_mbytes_per_sec": 0, 00:11:56.511 "r_mbytes_per_sec": 0, 00:11:56.511 "w_mbytes_per_sec": 0 00:11:56.511 }, 00:11:56.511 "claimed": true, 00:11:56.511 "claim_type": "exclusive_write", 00:11:56.511 "zoned": false, 00:11:56.511 "supported_io_types": { 00:11:56.511 "read": true, 00:11:56.511 "write": true, 00:11:56.511 "unmap": true, 
00:11:56.511 "flush": true, 00:11:56.511 "reset": true, 00:11:56.511 "nvme_admin": false, 00:11:56.511 "nvme_io": false, 00:11:56.511 "nvme_io_md": false, 00:11:56.511 "write_zeroes": true, 00:11:56.511 "zcopy": true, 00:11:56.511 "get_zone_info": false, 00:11:56.511 "zone_management": false, 00:11:56.511 "zone_append": false, 00:11:56.511 "compare": false, 00:11:56.511 "compare_and_write": false, 00:11:56.511 "abort": true, 00:11:56.511 "seek_hole": false, 00:11:56.511 "seek_data": false, 00:11:56.511 "copy": true, 00:11:56.511 "nvme_iov_md": false 00:11:56.511 }, 00:11:56.511 "memory_domains": [ 00:11:56.511 { 00:11:56.511 "dma_device_id": "system", 00:11:56.511 "dma_device_type": 1 00:11:56.511 }, 00:11:56.511 { 00:11:56.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.511 "dma_device_type": 2 00:11:56.511 } 00:11:56.511 ], 00:11:56.511 "driver_specific": {} 00:11:56.511 }' 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:56.511 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:56.771 11:52:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:56.771 [2024-07-12 11:52:46.947049] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:56.771 [2024-07-12 11:52:46.947068] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:56.771 [2024-07-12 11:52:46.947105] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:56.771 [2024-07-12 11:52:46.947140] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:56.771 [2024-07-12 11:52:46.947146] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b52a0 name Existed_Raid, state offline 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 614746 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 614746 ']' 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 614746 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 614746 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 614746' 00:11:56.771 killing process with pid 614746 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 614746 00:11:56.771 [2024-07-12 11:52:46.998599] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:56.771 11:52:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 614746 00:11:57.031 [2024-07-12 11:52:47.021566] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:57.031 11:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:57.031 00:11:57.031 real 0m21.194s 00:11:57.031 user 0m39.417s 00:11:57.031 sys 0m3.376s 00:11:57.031 11:52:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:57.031 11:52:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:57.031 ************************************ 00:11:57.031 END TEST raid_state_function_test_sb 00:11:57.031 ************************************ 00:11:57.031 11:52:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:57.031 11:52:47 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:11:57.031 11:52:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:57.031 11:52:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:57.031 11:52:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:57.031 ************************************ 00:11:57.031 START TEST raid_superblock_test 00:11:57.031 ************************************ 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local 
raid_level=raid0 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=618947 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 618947 /var/tmp/spdk-raid.sock 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 618947 ']' 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:57.031 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:57.031 11:52:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.289 [2024-07-12 11:52:47.312209] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:11:57.289 [2024-07-12 11:52:47.312249] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid618947 ] 00:11:57.289 [2024-07-12 11:52:47.374030] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:57.289 [2024-07-12 11:52:47.452354] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:57.289 [2024-07-12 11:52:47.506324] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:57.289 [2024-07-12 11:52:47.506351] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:11:58.225 malloc1 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:58.225 [2024-07-12 11:52:48.410622] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:58.225 [2024-07-12 11:52:48.410655] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:58.225 [2024-07-12 11:52:48.410667] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1530270 00:11:58.225 [2024-07-12 11:52:48.410688] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:58.225 [2024-07-12 11:52:48.411840] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:58.225 [2024-07-12 11:52:48.411861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:58.225 pt1 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:58.225 11:52:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:58.484 malloc2 00:11:58.484 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:58.743 [2024-07-12 11:52:48.739141] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:58.743 [2024-07-12 11:52:48.739173] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:58.743 [2024-07-12 11:52:48.739183] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1531580 00:11:58.743 [2024-07-12 11:52:48.739189] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:58.743 [2024-07-12 11:52:48.740201] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:58.743 [2024-07-12 11:52:48.740222] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:58.743 pt2 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:11:58.743 malloc3 00:11:58.743 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:11:59.003 [2024-07-12 11:52:49.071442] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:11:59.003 [2024-07-12 11:52:49.071474] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:59.003 [2024-07-12 11:52:49.071483] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16dbe30 00:11:59.003 [2024-07-12 11:52:49.071490] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:59.003 [2024-07-12 11:52:49.072562] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:59.003 [2024-07-12 11:52:49.072581] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:11:59.003 pt3 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:11:59.003 [2024-07-12 11:52:49.231869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:59.003 [2024-07-12 11:52:49.232761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:59.003 [2024-07-12 
11:52:49.232799] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:11:59.003 [2024-07-12 11:52:49.232900] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16df390 00:11:59.003 [2024-07-12 11:52:49.232906] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:59.003 [2024-07-12 11:52:49.233041] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e1c00 00:11:59.003 [2024-07-12 11:52:49.233138] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16df390 00:11:59.003 [2024-07-12 11:52:49.233147] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16df390 00:11:59.003 [2024-07-12 11:52:49.233210] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.003 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.262 11:52:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.262 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:59.262 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.262 "name": "raid_bdev1", 00:11:59.262 "uuid": "0f720a8d-3699-43c7-8ae2-27028515c11b", 00:11:59.262 "strip_size_kb": 64, 00:11:59.262 "state": "online", 00:11:59.262 "raid_level": "raid0", 00:11:59.262 "superblock": true, 00:11:59.262 "num_base_bdevs": 3, 00:11:59.262 "num_base_bdevs_discovered": 3, 00:11:59.262 "num_base_bdevs_operational": 3, 00:11:59.262 "base_bdevs_list": [ 00:11:59.262 { 00:11:59.262 "name": "pt1", 00:11:59.262 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:59.262 "is_configured": true, 00:11:59.262 "data_offset": 2048, 00:11:59.262 "data_size": 63488 00:11:59.262 }, 00:11:59.262 { 00:11:59.262 "name": "pt2", 00:11:59.262 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:59.262 "is_configured": true, 00:11:59.262 "data_offset": 2048, 00:11:59.262 "data_size": 63488 00:11:59.262 }, 00:11:59.262 { 00:11:59.262 "name": "pt3", 00:11:59.262 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:59.262 "is_configured": true, 00:11:59.262 "data_offset": 2048, 00:11:59.262 "data_size": 63488 00:11:59.262 } 00:11:59.262 ] 00:11:59.262 }' 00:11:59.262 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.262 11:52:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:59.832 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:59.832 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:59.832 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:11:59.832 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:59.832 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:59.832 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:59.832 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:59.832 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:59.832 [2024-07-12 11:52:50.002045] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:59.832 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:59.832 "name": "raid_bdev1", 00:11:59.832 "aliases": [ 00:11:59.832 "0f720a8d-3699-43c7-8ae2-27028515c11b" 00:11:59.832 ], 00:11:59.832 "product_name": "Raid Volume", 00:11:59.832 "block_size": 512, 00:11:59.832 "num_blocks": 190464, 00:11:59.832 "uuid": "0f720a8d-3699-43c7-8ae2-27028515c11b", 00:11:59.832 "assigned_rate_limits": { 00:11:59.832 "rw_ios_per_sec": 0, 00:11:59.832 "rw_mbytes_per_sec": 0, 00:11:59.832 "r_mbytes_per_sec": 0, 00:11:59.832 "w_mbytes_per_sec": 0 00:11:59.832 }, 00:11:59.832 "claimed": false, 00:11:59.832 "zoned": false, 00:11:59.832 "supported_io_types": { 00:11:59.832 "read": true, 00:11:59.832 "write": true, 00:11:59.832 "unmap": true, 00:11:59.832 "flush": true, 00:11:59.832 "reset": true, 00:11:59.832 "nvme_admin": false, 00:11:59.832 "nvme_io": false, 00:11:59.832 "nvme_io_md": false, 00:11:59.832 "write_zeroes": true, 00:11:59.832 "zcopy": false, 00:11:59.832 "get_zone_info": false, 00:11:59.832 "zone_management": false, 00:11:59.832 "zone_append": false, 00:11:59.832 "compare": false, 00:11:59.832 "compare_and_write": false, 00:11:59.832 "abort": false, 00:11:59.832 "seek_hole": false, 00:11:59.832 
"seek_data": false, 00:11:59.832 "copy": false, 00:11:59.832 "nvme_iov_md": false 00:11:59.832 }, 00:11:59.832 "memory_domains": [ 00:11:59.832 { 00:11:59.832 "dma_device_id": "system", 00:11:59.832 "dma_device_type": 1 00:11:59.832 }, 00:11:59.832 { 00:11:59.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.832 "dma_device_type": 2 00:11:59.832 }, 00:11:59.832 { 00:11:59.832 "dma_device_id": "system", 00:11:59.832 "dma_device_type": 1 00:11:59.832 }, 00:11:59.832 { 00:11:59.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.832 "dma_device_type": 2 00:11:59.832 }, 00:11:59.832 { 00:11:59.832 "dma_device_id": "system", 00:11:59.832 "dma_device_type": 1 00:11:59.832 }, 00:11:59.832 { 00:11:59.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.832 "dma_device_type": 2 00:11:59.832 } 00:11:59.832 ], 00:11:59.832 "driver_specific": { 00:11:59.833 "raid": { 00:11:59.833 "uuid": "0f720a8d-3699-43c7-8ae2-27028515c11b", 00:11:59.833 "strip_size_kb": 64, 00:11:59.833 "state": "online", 00:11:59.833 "raid_level": "raid0", 00:11:59.833 "superblock": true, 00:11:59.833 "num_base_bdevs": 3, 00:11:59.833 "num_base_bdevs_discovered": 3, 00:11:59.833 "num_base_bdevs_operational": 3, 00:11:59.833 "base_bdevs_list": [ 00:11:59.833 { 00:11:59.833 "name": "pt1", 00:11:59.833 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:59.833 "is_configured": true, 00:11:59.833 "data_offset": 2048, 00:11:59.833 "data_size": 63488 00:11:59.833 }, 00:11:59.833 { 00:11:59.833 "name": "pt2", 00:11:59.833 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:59.833 "is_configured": true, 00:11:59.833 "data_offset": 2048, 00:11:59.833 "data_size": 63488 00:11:59.833 }, 00:11:59.833 { 00:11:59.833 "name": "pt3", 00:11:59.833 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:59.833 "is_configured": true, 00:11:59.833 "data_offset": 2048, 00:11:59.833 "data_size": 63488 00:11:59.833 } 00:11:59.833 ] 00:11:59.833 } 00:11:59.833 } 00:11:59.833 }' 00:11:59.833 11:52:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:59.833 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:59.833 pt2 00:11:59.833 pt3' 00:11:59.833 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:59.833 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:59.833 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:00.092 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:00.092 "name": "pt1", 00:12:00.092 "aliases": [ 00:12:00.092 "00000000-0000-0000-0000-000000000001" 00:12:00.092 ], 00:12:00.092 "product_name": "passthru", 00:12:00.092 "block_size": 512, 00:12:00.092 "num_blocks": 65536, 00:12:00.092 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:00.092 "assigned_rate_limits": { 00:12:00.092 "rw_ios_per_sec": 0, 00:12:00.092 "rw_mbytes_per_sec": 0, 00:12:00.092 "r_mbytes_per_sec": 0, 00:12:00.092 "w_mbytes_per_sec": 0 00:12:00.092 }, 00:12:00.092 "claimed": true, 00:12:00.092 "claim_type": "exclusive_write", 00:12:00.092 "zoned": false, 00:12:00.092 "supported_io_types": { 00:12:00.092 "read": true, 00:12:00.092 "write": true, 00:12:00.092 "unmap": true, 00:12:00.092 "flush": true, 00:12:00.092 "reset": true, 00:12:00.092 "nvme_admin": false, 00:12:00.092 "nvme_io": false, 00:12:00.092 "nvme_io_md": false, 00:12:00.092 "write_zeroes": true, 00:12:00.092 "zcopy": true, 00:12:00.092 "get_zone_info": false, 00:12:00.092 "zone_management": false, 00:12:00.092 "zone_append": false, 00:12:00.093 "compare": false, 00:12:00.093 "compare_and_write": false, 00:12:00.093 "abort": true, 00:12:00.093 "seek_hole": false, 00:12:00.093 "seek_data": false, 
00:12:00.093 "copy": true, 00:12:00.093 "nvme_iov_md": false 00:12:00.093 }, 00:12:00.093 "memory_domains": [ 00:12:00.093 { 00:12:00.093 "dma_device_id": "system", 00:12:00.093 "dma_device_type": 1 00:12:00.093 }, 00:12:00.093 { 00:12:00.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.093 "dma_device_type": 2 00:12:00.093 } 00:12:00.093 ], 00:12:00.093 "driver_specific": { 00:12:00.093 "passthru": { 00:12:00.093 "name": "pt1", 00:12:00.093 "base_bdev_name": "malloc1" 00:12:00.093 } 00:12:00.093 } 00:12:00.093 }' 00:12:00.093 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.093 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.093 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:00.093 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.093 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:12:00.352 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:00.612 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:00.612 "name": "pt2", 00:12:00.612 "aliases": [ 00:12:00.612 "00000000-0000-0000-0000-000000000002" 00:12:00.612 ], 00:12:00.612 "product_name": "passthru", 00:12:00.612 "block_size": 512, 00:12:00.612 "num_blocks": 65536, 00:12:00.612 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:00.612 "assigned_rate_limits": { 00:12:00.612 "rw_ios_per_sec": 0, 00:12:00.612 "rw_mbytes_per_sec": 0, 00:12:00.612 "r_mbytes_per_sec": 0, 00:12:00.612 "w_mbytes_per_sec": 0 00:12:00.612 }, 00:12:00.612 "claimed": true, 00:12:00.612 "claim_type": "exclusive_write", 00:12:00.612 "zoned": false, 00:12:00.612 "supported_io_types": { 00:12:00.612 "read": true, 00:12:00.612 "write": true, 00:12:00.612 "unmap": true, 00:12:00.612 "flush": true, 00:12:00.612 "reset": true, 00:12:00.612 "nvme_admin": false, 00:12:00.612 "nvme_io": false, 00:12:00.612 "nvme_io_md": false, 00:12:00.612 "write_zeroes": true, 00:12:00.612 "zcopy": true, 00:12:00.612 "get_zone_info": false, 00:12:00.612 "zone_management": false, 00:12:00.612 "zone_append": false, 00:12:00.612 "compare": false, 00:12:00.612 "compare_and_write": false, 00:12:00.612 "abort": true, 00:12:00.612 "seek_hole": false, 00:12:00.612 "seek_data": false, 00:12:00.612 "copy": true, 00:12:00.612 "nvme_iov_md": false 00:12:00.612 }, 00:12:00.612 "memory_domains": [ 00:12:00.612 { 00:12:00.612 "dma_device_id": "system", 00:12:00.612 "dma_device_type": 1 00:12:00.612 }, 00:12:00.612 { 00:12:00.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.612 "dma_device_type": 2 00:12:00.612 } 00:12:00.612 ], 00:12:00.612 "driver_specific": { 00:12:00.612 "passthru": { 00:12:00.612 "name": "pt2", 00:12:00.612 "base_bdev_name": "malloc2" 00:12:00.612 } 00:12:00.612 } 00:12:00.612 }' 00:12:00.612 11:52:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.612 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.612 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:00.612 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.612 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.612 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:00.612 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.871 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.871 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:00.871 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.871 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.871 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.871 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:00.871 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:00.871 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:01.130 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:01.130 "name": "pt3", 00:12:01.130 "aliases": [ 00:12:01.130 "00000000-0000-0000-0000-000000000003" 00:12:01.130 ], 00:12:01.130 "product_name": "passthru", 00:12:01.130 "block_size": 512, 00:12:01.130 "num_blocks": 65536, 00:12:01.130 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:01.130 "assigned_rate_limits": { 
00:12:01.130 "rw_ios_per_sec": 0, 00:12:01.130 "rw_mbytes_per_sec": 0, 00:12:01.130 "r_mbytes_per_sec": 0, 00:12:01.130 "w_mbytes_per_sec": 0 00:12:01.130 }, 00:12:01.130 "claimed": true, 00:12:01.130 "claim_type": "exclusive_write", 00:12:01.130 "zoned": false, 00:12:01.130 "supported_io_types": { 00:12:01.130 "read": true, 00:12:01.130 "write": true, 00:12:01.130 "unmap": true, 00:12:01.130 "flush": true, 00:12:01.130 "reset": true, 00:12:01.130 "nvme_admin": false, 00:12:01.130 "nvme_io": false, 00:12:01.130 "nvme_io_md": false, 00:12:01.130 "write_zeroes": true, 00:12:01.130 "zcopy": true, 00:12:01.130 "get_zone_info": false, 00:12:01.130 "zone_management": false, 00:12:01.130 "zone_append": false, 00:12:01.130 "compare": false, 00:12:01.130 "compare_and_write": false, 00:12:01.130 "abort": true, 00:12:01.130 "seek_hole": false, 00:12:01.130 "seek_data": false, 00:12:01.130 "copy": true, 00:12:01.130 "nvme_iov_md": false 00:12:01.130 }, 00:12:01.130 "memory_domains": [ 00:12:01.130 { 00:12:01.130 "dma_device_id": "system", 00:12:01.130 "dma_device_type": 1 00:12:01.130 }, 00:12:01.130 { 00:12:01.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.130 "dma_device_type": 2 00:12:01.130 } 00:12:01.130 ], 00:12:01.130 "driver_specific": { 00:12:01.131 "passthru": { 00:12:01.131 "name": "pt3", 00:12:01.131 "base_bdev_name": "malloc3" 00:12:01.131 } 00:12:01.131 } 00:12:01.131 }' 00:12:01.131 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:01.131 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:01.131 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:01.131 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:01.131 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:01.131 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:12:01.131 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:01.131 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:01.390 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:01.390 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:01.390 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:01.390 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:01.390 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:01.390 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:01.390 [2024-07-12 11:52:51.618223] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:01.390 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0f720a8d-3699-43c7-8ae2-27028515c11b 00:12:01.390 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 0f720a8d-3699-43c7-8ae2-27028515c11b ']' 00:12:01.390 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:01.649 [2024-07-12 11:52:51.782453] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:01.649 [2024-07-12 11:52:51.782470] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:01.649 [2024-07-12 11:52:51.782508] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:01.649 [2024-07-12 11:52:51.782546] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:12:01.649 [2024-07-12 11:52:51.782553] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16df390 name raid_bdev1, state offline 00:12:01.649 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.649 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:01.909 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:01.909 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:01.909 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:01.909 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:01.909 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:01.909 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:02.169 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:02.169 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:02.428 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:02.687 [2024-07-12 11:52:52.740916] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:02.687 [2024-07-12 11:52:52.741904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:02.687 [2024-07-12 11:52:52.741935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:12:02.687 [2024-07-12 11:52:52.741967] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:02.687 [2024-07-12 11:52:52.741993] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:02.687 [2024-07-12 11:52:52.742006] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:12:02.687 [2024-07-12 11:52:52.742015] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:02.687 [2024-07-12 11:52:52.742022] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16d9470 name raid_bdev1, state configuring 00:12:02.687 request: 00:12:02.687 { 00:12:02.687 "name": "raid_bdev1", 00:12:02.687 "raid_level": "raid0", 00:12:02.687 "base_bdevs": [ 00:12:02.687 "malloc1", 00:12:02.687 "malloc2", 00:12:02.687 "malloc3" 00:12:02.687 ], 00:12:02.687 "superblock": false, 00:12:02.687 "strip_size_kb": 64, 00:12:02.687 "method": "bdev_raid_create", 00:12:02.687 "req_id": 1 00:12:02.687 } 00:12:02.687 Got JSON-RPC error response 00:12:02.687 response: 00:12:02.687 { 00:12:02.687 "code": -17, 00:12:02.687 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:02.688 } 00:12:02.688 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:02.688 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:12:02.688 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:02.688 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:02.688 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.688 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:02.688 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:02.688 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:02.688 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:02.956 [2024-07-12 11:52:53.057696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:02.956 [2024-07-12 11:52:53.057728] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:02.956 [2024-07-12 11:52:53.057739] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16dc060 00:12:02.956 [2024-07-12 11:52:53.057760] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:02.956 [2024-07-12 11:52:53.058947] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:02.956 [2024-07-12 11:52:53.058969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:02.956 [2024-07-12 11:52:53.059015] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:02.956 [2024-07-12 11:52:53.059033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:02.956 pt1 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.956 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:03.220 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:03.220 "name": "raid_bdev1", 00:12:03.220 "uuid": "0f720a8d-3699-43c7-8ae2-27028515c11b", 00:12:03.220 "strip_size_kb": 64, 00:12:03.220 "state": "configuring", 00:12:03.220 "raid_level": "raid0", 00:12:03.220 "superblock": true, 00:12:03.220 "num_base_bdevs": 3, 00:12:03.220 "num_base_bdevs_discovered": 1, 00:12:03.220 "num_base_bdevs_operational": 3, 00:12:03.220 "base_bdevs_list": [ 00:12:03.220 { 00:12:03.220 "name": "pt1", 00:12:03.220 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:03.220 
"is_configured": true, 00:12:03.220 "data_offset": 2048, 00:12:03.220 "data_size": 63488 00:12:03.220 }, 00:12:03.220 { 00:12:03.220 "name": null, 00:12:03.220 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:03.220 "is_configured": false, 00:12:03.220 "data_offset": 2048, 00:12:03.220 "data_size": 63488 00:12:03.220 }, 00:12:03.220 { 00:12:03.220 "name": null, 00:12:03.220 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:03.220 "is_configured": false, 00:12:03.220 "data_offset": 2048, 00:12:03.220 "data_size": 63488 00:12:03.220 } 00:12:03.220 ] 00:12:03.220 }' 00:12:03.220 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:03.220 11:52:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.478 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:12:03.478 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:03.736 [2024-07-12 11:52:53.871812] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:03.736 [2024-07-12 11:52:53.871846] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:03.736 [2024-07-12 11:52:53.871859] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16dd2d0 00:12:03.736 [2024-07-12 11:52:53.871866] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:03.736 [2024-07-12 11:52:53.872115] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:03.736 [2024-07-12 11:52:53.872125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:03.736 [2024-07-12 11:52:53.872169] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:03.736 [2024-07-12 
11:52:53.872182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:03.736 pt2 00:12:03.736 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:03.996 [2024-07-12 11:52:54.048280] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:03.996 "name": "raid_bdev1", 00:12:03.996 
"uuid": "0f720a8d-3699-43c7-8ae2-27028515c11b", 00:12:03.996 "strip_size_kb": 64, 00:12:03.996 "state": "configuring", 00:12:03.996 "raid_level": "raid0", 00:12:03.996 "superblock": true, 00:12:03.996 "num_base_bdevs": 3, 00:12:03.996 "num_base_bdevs_discovered": 1, 00:12:03.996 "num_base_bdevs_operational": 3, 00:12:03.996 "base_bdevs_list": [ 00:12:03.996 { 00:12:03.996 "name": "pt1", 00:12:03.996 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:03.996 "is_configured": true, 00:12:03.996 "data_offset": 2048, 00:12:03.996 "data_size": 63488 00:12:03.996 }, 00:12:03.996 { 00:12:03.996 "name": null, 00:12:03.996 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:03.996 "is_configured": false, 00:12:03.996 "data_offset": 2048, 00:12:03.996 "data_size": 63488 00:12:03.996 }, 00:12:03.996 { 00:12:03.996 "name": null, 00:12:03.996 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:03.996 "is_configured": false, 00:12:03.996 "data_offset": 2048, 00:12:03.996 "data_size": 63488 00:12:03.996 } 00:12:03.996 ] 00:12:03.996 }' 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:03.996 11:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:04.564 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:04.564 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:04.564 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:04.824 [2024-07-12 11:52:54.862378] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:04.824 [2024-07-12 11:52:54.862415] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:04.824 [2024-07-12 11:52:54.862426] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16de0c0 00:12:04.824 [2024-07-12 11:52:54.862449] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:04.824 [2024-07-12 11:52:54.862708] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:04.824 [2024-07-12 11:52:54.862720] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:04.824 [2024-07-12 11:52:54.862763] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:04.824 [2024-07-12 11:52:54.862775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:04.824 pt2 00:12:04.824 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:04.824 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:04.824 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:04.824 [2024-07-12 11:52:55.030813] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:04.824 [2024-07-12 11:52:55.030838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:04.824 [2024-07-12 11:52:55.030848] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16dff00 00:12:04.824 [2024-07-12 11:52:55.030853] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:04.824 [2024-07-12 11:52:55.031072] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:04.824 [2024-07-12 11:52:55.031081] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:04.824 [2024-07-12 11:52:55.031115] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:12:04.824 
[2024-07-12 11:52:55.031126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:04.824 [2024-07-12 11:52:55.031197] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16e1850 00:12:04.824 [2024-07-12 11:52:55.031202] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:04.824 [2024-07-12 11:52:55.031310] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152f9c0 00:12:04.824 [2024-07-12 11:52:55.031391] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16e1850 00:12:04.824 [2024-07-12 11:52:55.031396] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16e1850 00:12:04.824 [2024-07-12 11:52:55.031460] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:04.824 pt3 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:04.824 11:52:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.824 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:05.085 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:05.085 "name": "raid_bdev1", 00:12:05.085 "uuid": "0f720a8d-3699-43c7-8ae2-27028515c11b", 00:12:05.085 "strip_size_kb": 64, 00:12:05.085 "state": "online", 00:12:05.085 "raid_level": "raid0", 00:12:05.085 "superblock": true, 00:12:05.085 "num_base_bdevs": 3, 00:12:05.085 "num_base_bdevs_discovered": 3, 00:12:05.085 "num_base_bdevs_operational": 3, 00:12:05.085 "base_bdevs_list": [ 00:12:05.085 { 00:12:05.085 "name": "pt1", 00:12:05.085 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:05.085 "is_configured": true, 00:12:05.085 "data_offset": 2048, 00:12:05.085 "data_size": 63488 00:12:05.085 }, 00:12:05.085 { 00:12:05.085 "name": "pt2", 00:12:05.085 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:05.085 "is_configured": true, 00:12:05.085 "data_offset": 2048, 00:12:05.085 "data_size": 63488 00:12:05.085 }, 00:12:05.085 { 00:12:05.085 "name": "pt3", 00:12:05.085 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:05.085 "is_configured": true, 00:12:05.085 "data_offset": 2048, 00:12:05.085 "data_size": 63488 00:12:05.085 } 00:12:05.085 ] 00:12:05.085 }' 00:12:05.085 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:05.085 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:05.654 [2024-07-12 11:52:55.873194] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:05.654 "name": "raid_bdev1", 00:12:05.654 "aliases": [ 00:12:05.654 "0f720a8d-3699-43c7-8ae2-27028515c11b" 00:12:05.654 ], 00:12:05.654 "product_name": "Raid Volume", 00:12:05.654 "block_size": 512, 00:12:05.654 "num_blocks": 190464, 00:12:05.654 "uuid": "0f720a8d-3699-43c7-8ae2-27028515c11b", 00:12:05.654 "assigned_rate_limits": { 00:12:05.654 "rw_ios_per_sec": 0, 00:12:05.654 "rw_mbytes_per_sec": 0, 00:12:05.654 "r_mbytes_per_sec": 0, 00:12:05.654 "w_mbytes_per_sec": 0 00:12:05.654 }, 00:12:05.654 "claimed": false, 00:12:05.654 "zoned": false, 00:12:05.654 "supported_io_types": { 00:12:05.654 "read": true, 00:12:05.654 "write": true, 00:12:05.654 "unmap": true, 00:12:05.654 "flush": true, 00:12:05.654 "reset": true, 00:12:05.654 "nvme_admin": false, 00:12:05.654 "nvme_io": false, 00:12:05.654 "nvme_io_md": false, 00:12:05.654 "write_zeroes": true, 00:12:05.654 "zcopy": false, 00:12:05.654 
"get_zone_info": false, 00:12:05.654 "zone_management": false, 00:12:05.654 "zone_append": false, 00:12:05.654 "compare": false, 00:12:05.654 "compare_and_write": false, 00:12:05.654 "abort": false, 00:12:05.654 "seek_hole": false, 00:12:05.654 "seek_data": false, 00:12:05.654 "copy": false, 00:12:05.654 "nvme_iov_md": false 00:12:05.654 }, 00:12:05.654 "memory_domains": [ 00:12:05.654 { 00:12:05.654 "dma_device_id": "system", 00:12:05.654 "dma_device_type": 1 00:12:05.654 }, 00:12:05.654 { 00:12:05.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:05.654 "dma_device_type": 2 00:12:05.654 }, 00:12:05.654 { 00:12:05.654 "dma_device_id": "system", 00:12:05.654 "dma_device_type": 1 00:12:05.654 }, 00:12:05.654 { 00:12:05.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:05.654 "dma_device_type": 2 00:12:05.654 }, 00:12:05.654 { 00:12:05.654 "dma_device_id": "system", 00:12:05.654 "dma_device_type": 1 00:12:05.654 }, 00:12:05.654 { 00:12:05.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:05.654 "dma_device_type": 2 00:12:05.654 } 00:12:05.654 ], 00:12:05.654 "driver_specific": { 00:12:05.654 "raid": { 00:12:05.654 "uuid": "0f720a8d-3699-43c7-8ae2-27028515c11b", 00:12:05.654 "strip_size_kb": 64, 00:12:05.654 "state": "online", 00:12:05.654 "raid_level": "raid0", 00:12:05.654 "superblock": true, 00:12:05.654 "num_base_bdevs": 3, 00:12:05.654 "num_base_bdevs_discovered": 3, 00:12:05.654 "num_base_bdevs_operational": 3, 00:12:05.654 "base_bdevs_list": [ 00:12:05.654 { 00:12:05.654 "name": "pt1", 00:12:05.654 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:05.654 "is_configured": true, 00:12:05.654 "data_offset": 2048, 00:12:05.654 "data_size": 63488 00:12:05.654 }, 00:12:05.654 { 00:12:05.654 "name": "pt2", 00:12:05.654 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:05.654 "is_configured": true, 00:12:05.654 "data_offset": 2048, 00:12:05.654 "data_size": 63488 00:12:05.654 }, 00:12:05.654 { 00:12:05.654 "name": "pt3", 00:12:05.654 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:12:05.654 "is_configured": true, 00:12:05.654 "data_offset": 2048, 00:12:05.654 "data_size": 63488 00:12:05.654 } 00:12:05.654 ] 00:12:05.654 } 00:12:05.654 } 00:12:05.654 }' 00:12:05.654 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:05.914 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:05.914 pt2 00:12:05.914 pt3' 00:12:05.914 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:05.914 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:05.914 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:05.914 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:05.914 "name": "pt1", 00:12:05.914 "aliases": [ 00:12:05.914 "00000000-0000-0000-0000-000000000001" 00:12:05.914 ], 00:12:05.914 "product_name": "passthru", 00:12:05.914 "block_size": 512, 00:12:05.914 "num_blocks": 65536, 00:12:05.914 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:05.914 "assigned_rate_limits": { 00:12:05.914 "rw_ios_per_sec": 0, 00:12:05.914 "rw_mbytes_per_sec": 0, 00:12:05.914 "r_mbytes_per_sec": 0, 00:12:05.914 "w_mbytes_per_sec": 0 00:12:05.914 }, 00:12:05.914 "claimed": true, 00:12:05.914 "claim_type": "exclusive_write", 00:12:05.914 "zoned": false, 00:12:05.914 "supported_io_types": { 00:12:05.914 "read": true, 00:12:05.914 "write": true, 00:12:05.914 "unmap": true, 00:12:05.914 "flush": true, 00:12:05.914 "reset": true, 00:12:05.914 "nvme_admin": false, 00:12:05.914 "nvme_io": false, 00:12:05.914 "nvme_io_md": false, 00:12:05.914 "write_zeroes": true, 00:12:05.914 "zcopy": true, 00:12:05.914 "get_zone_info": false, 
00:12:05.914 "zone_management": false, 00:12:05.914 "zone_append": false, 00:12:05.914 "compare": false, 00:12:05.914 "compare_and_write": false, 00:12:05.914 "abort": true, 00:12:05.914 "seek_hole": false, 00:12:05.914 "seek_data": false, 00:12:05.914 "copy": true, 00:12:05.914 "nvme_iov_md": false 00:12:05.914 }, 00:12:05.914 "memory_domains": [ 00:12:05.914 { 00:12:05.914 "dma_device_id": "system", 00:12:05.914 "dma_device_type": 1 00:12:05.914 }, 00:12:05.914 { 00:12:05.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:05.914 "dma_device_type": 2 00:12:05.914 } 00:12:05.914 ], 00:12:05.914 "driver_specific": { 00:12:05.914 "passthru": { 00:12:05.914 "name": "pt1", 00:12:05.914 "base_bdev_name": "malloc1" 00:12:05.914 } 00:12:05.914 } 00:12:05.914 }' 00:12:05.914 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.914 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:06.173 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:06.173 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:06.173 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:06.173 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:06.173 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:06.173 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:06.173 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:06.173 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:06.173 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:06.432 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:06.432 11:52:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:06.432 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:06.432 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:06.432 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:06.432 "name": "pt2", 00:12:06.432 "aliases": [ 00:12:06.432 "00000000-0000-0000-0000-000000000002" 00:12:06.432 ], 00:12:06.432 "product_name": "passthru", 00:12:06.432 "block_size": 512, 00:12:06.432 "num_blocks": 65536, 00:12:06.432 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:06.432 "assigned_rate_limits": { 00:12:06.432 "rw_ios_per_sec": 0, 00:12:06.432 "rw_mbytes_per_sec": 0, 00:12:06.432 "r_mbytes_per_sec": 0, 00:12:06.432 "w_mbytes_per_sec": 0 00:12:06.432 }, 00:12:06.432 "claimed": true, 00:12:06.432 "claim_type": "exclusive_write", 00:12:06.432 "zoned": false, 00:12:06.432 "supported_io_types": { 00:12:06.432 "read": true, 00:12:06.432 "write": true, 00:12:06.432 "unmap": true, 00:12:06.432 "flush": true, 00:12:06.432 "reset": true, 00:12:06.432 "nvme_admin": false, 00:12:06.432 "nvme_io": false, 00:12:06.432 "nvme_io_md": false, 00:12:06.432 "write_zeroes": true, 00:12:06.432 "zcopy": true, 00:12:06.432 "get_zone_info": false, 00:12:06.432 "zone_management": false, 00:12:06.432 "zone_append": false, 00:12:06.432 "compare": false, 00:12:06.432 "compare_and_write": false, 00:12:06.432 "abort": true, 00:12:06.432 "seek_hole": false, 00:12:06.432 "seek_data": false, 00:12:06.432 "copy": true, 00:12:06.432 "nvme_iov_md": false 00:12:06.432 }, 00:12:06.432 "memory_domains": [ 00:12:06.432 { 00:12:06.432 "dma_device_id": "system", 00:12:06.432 "dma_device_type": 1 00:12:06.432 }, 00:12:06.432 { 00:12:06.432 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:06.432 
"dma_device_type": 2 00:12:06.432 } 00:12:06.432 ], 00:12:06.432 "driver_specific": { 00:12:06.432 "passthru": { 00:12:06.432 "name": "pt2", 00:12:06.432 "base_bdev_name": "malloc2" 00:12:06.432 } 00:12:06.432 } 00:12:06.432 }' 00:12:06.432 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:06.432 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:06.432 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:06.691 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:06.951 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:06.951 "name": "pt3", 00:12:06.951 "aliases": [ 00:12:06.951 
"00000000-0000-0000-0000-000000000003" 00:12:06.951 ], 00:12:06.951 "product_name": "passthru", 00:12:06.951 "block_size": 512, 00:12:06.951 "num_blocks": 65536, 00:12:06.951 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:06.951 "assigned_rate_limits": { 00:12:06.951 "rw_ios_per_sec": 0, 00:12:06.951 "rw_mbytes_per_sec": 0, 00:12:06.951 "r_mbytes_per_sec": 0, 00:12:06.951 "w_mbytes_per_sec": 0 00:12:06.951 }, 00:12:06.951 "claimed": true, 00:12:06.951 "claim_type": "exclusive_write", 00:12:06.951 "zoned": false, 00:12:06.951 "supported_io_types": { 00:12:06.951 "read": true, 00:12:06.951 "write": true, 00:12:06.951 "unmap": true, 00:12:06.951 "flush": true, 00:12:06.951 "reset": true, 00:12:06.951 "nvme_admin": false, 00:12:06.951 "nvme_io": false, 00:12:06.951 "nvme_io_md": false, 00:12:06.951 "write_zeroes": true, 00:12:06.951 "zcopy": true, 00:12:06.951 "get_zone_info": false, 00:12:06.951 "zone_management": false, 00:12:06.951 "zone_append": false, 00:12:06.951 "compare": false, 00:12:06.951 "compare_and_write": false, 00:12:06.951 "abort": true, 00:12:06.951 "seek_hole": false, 00:12:06.951 "seek_data": false, 00:12:06.951 "copy": true, 00:12:06.951 "nvme_iov_md": false 00:12:06.951 }, 00:12:06.951 "memory_domains": [ 00:12:06.951 { 00:12:06.951 "dma_device_id": "system", 00:12:06.951 "dma_device_type": 1 00:12:06.951 }, 00:12:06.951 { 00:12:06.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:06.951 "dma_device_type": 2 00:12:06.951 } 00:12:06.951 ], 00:12:06.951 "driver_specific": { 00:12:06.951 "passthru": { 00:12:06.951 "name": "pt3", 00:12:06.951 "base_bdev_name": "malloc3" 00:12:06.951 } 00:12:06.951 } 00:12:06.951 }' 00:12:06.951 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:06.951 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:06.951 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:06.951 11:52:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:06.951 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:07.210 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:07.210 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:07.210 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:07.210 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:07.210 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:07.210 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:07.210 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:07.210 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:07.210 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:07.470 [2024-07-12 11:52:57.529469] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 0f720a8d-3699-43c7-8ae2-27028515c11b '!=' 0f720a8d-3699-43c7-8ae2-27028515c11b ']' 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 618947 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 618947 ']' 00:12:07.470 11:52:57 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 618947 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 618947 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 618947' 00:12:07.470 killing process with pid 618947 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 618947 00:12:07.470 [2024-07-12 11:52:57.587187] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:07.470 [2024-07-12 11:52:57.587228] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:07.470 [2024-07-12 11:52:57.587265] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:07.470 [2024-07-12 11:52:57.587271] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16e1850 name raid_bdev1, state offline 00:12:07.470 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 618947 00:12:07.470 [2024-07-12 11:52:57.610603] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:07.730 11:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:07.730 00:12:07.730 real 0m10.526s 00:12:07.730 user 0m19.235s 00:12:07.730 sys 0m1.606s 00:12:07.730 11:52:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:07.730 11:52:57 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@10 -- # set +x 00:12:07.730 ************************************ 00:12:07.730 END TEST raid_superblock_test 00:12:07.730 ************************************ 00:12:07.730 11:52:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:07.730 11:52:57 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:12:07.730 11:52:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:07.730 11:52:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:07.730 11:52:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:07.730 ************************************ 00:12:07.730 START TEST raid_read_error_test 00:12:07.730 ************************************ 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.vCyaSHjZ86 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=620903 00:12:07.730 
11:52:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 620903 /var/tmp/spdk-raid.sock 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 620903 ']' 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:07.730 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:07.730 11:52:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.730 [2024-07-12 11:52:57.906090] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:12:07.730 [2024-07-12 11:52:57.906129] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid620903 ] 00:12:07.730 [2024-07-12 11:52:57.964033] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:07.990 [2024-07-12 11:52:58.043373] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:07.990 [2024-07-12 11:52:58.102928] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:07.990 [2024-07-12 11:52:58.102950] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:08.563 11:52:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:08.563 11:52:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:08.563 11:52:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:08.563 11:52:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:08.822 BaseBdev1_malloc 00:12:08.822 11:52:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:08.822 true 00:12:08.822 11:52:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:09.082 [2024-07-12 11:52:59.175377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:09.082 [2024-07-12 11:52:59.175408] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.082 
[2024-07-12 11:52:59.175420] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27882d0 00:12:09.082 [2024-07-12 11:52:59.175426] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.082 [2024-07-12 11:52:59.176695] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:09.082 [2024-07-12 11:52:59.176716] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:09.082 BaseBdev1 00:12:09.082 11:52:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:09.082 11:52:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:09.342 BaseBdev2_malloc 00:12:09.342 11:52:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:09.342 true 00:12:09.342 11:52:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:09.602 [2024-07-12 11:52:59.676242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:09.602 [2024-07-12 11:52:59.676274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.602 [2024-07-12 11:52:59.676286] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x278cf40 00:12:09.602 [2024-07-12 11:52:59.676292] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.602 [2024-07-12 11:52:59.677390] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:09.602 [2024-07-12 11:52:59.677410] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:09.602 BaseBdev2 00:12:09.602 11:52:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:09.602 11:52:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:12:09.862 BaseBdev3_malloc 00:12:09.862 11:52:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:12:09.862 true 00:12:09.862 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:12:10.121 [2024-07-12 11:53:00.173111] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:12:10.121 [2024-07-12 11:53:00.173142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:10.121 [2024-07-12 11:53:00.173152] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x278fea0 00:12:10.121 [2024-07-12 11:53:00.173159] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:10.121 [2024-07-12 11:53:00.174298] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:10.121 [2024-07-12 11:53:00.174317] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:12:10.121 BaseBdev3 00:12:10.121 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:12:10.122 [2024-07-12 11:53:00.325526] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:10.122 [2024-07-12 11:53:00.326379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:10.122 [2024-07-12 11:53:00.326425] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:10.122 [2024-07-12 11:53:00.326568] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2789000 00:12:10.122 [2024-07-12 11:53:00.326575] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:10.122 [2024-07-12 11:53:00.326705] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x278f7c0 00:12:10.122 [2024-07-12 11:53:00.326810] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2789000 00:12:10.122 [2024-07-12 11:53:00.326815] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2789000 00:12:10.122 [2024-07-12 11:53:00.326878] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:10.122 
11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:10.122 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.381 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:10.381 "name": "raid_bdev1", 00:12:10.381 "uuid": "66e09eeb-d05d-49ce-95f2-b317083f822e", 00:12:10.381 "strip_size_kb": 64, 00:12:10.381 "state": "online", 00:12:10.381 "raid_level": "raid0", 00:12:10.381 "superblock": true, 00:12:10.381 "num_base_bdevs": 3, 00:12:10.381 "num_base_bdevs_discovered": 3, 00:12:10.381 "num_base_bdevs_operational": 3, 00:12:10.381 "base_bdevs_list": [ 00:12:10.381 { 00:12:10.381 "name": "BaseBdev1", 00:12:10.381 "uuid": "421f578a-e569-554f-8228-633eee07d20c", 00:12:10.381 "is_configured": true, 00:12:10.381 "data_offset": 2048, 00:12:10.381 "data_size": 63488 00:12:10.381 }, 00:12:10.381 { 00:12:10.381 "name": "BaseBdev2", 00:12:10.381 "uuid": "cfa59257-c25f-5cae-9c8f-7983baa7ca77", 00:12:10.381 "is_configured": true, 00:12:10.381 "data_offset": 2048, 00:12:10.381 "data_size": 63488 00:12:10.381 }, 00:12:10.381 { 00:12:10.381 "name": "BaseBdev3", 00:12:10.381 "uuid": "6824722e-a76c-5939-aa96-eff140eb9ec3", 00:12:10.381 "is_configured": true, 00:12:10.381 "data_offset": 2048, 00:12:10.381 "data_size": 63488 00:12:10.381 } 00:12:10.381 ] 00:12:10.381 }' 00:12:10.381 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:10.381 11:53:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.949 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1
00:12:10.949 11:53:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:12:10.949 [2024-07-12 11:53:01.075672] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25de840
00:12:11.886 11:53:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]]
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:12:12.146 "name": "raid_bdev1",
00:12:12.146 "uuid": "66e09eeb-d05d-49ce-95f2-b317083f822e",
00:12:12.146 "strip_size_kb": 64,
00:12:12.146 "state": "online",
00:12:12.146 "raid_level": "raid0",
00:12:12.146 "superblock": true,
00:12:12.146 "num_base_bdevs": 3,
00:12:12.146 "num_base_bdevs_discovered": 3,
00:12:12.146 "num_base_bdevs_operational": 3,
00:12:12.146 "base_bdevs_list": [
00:12:12.146 {
00:12:12.146 "name": "BaseBdev1",
00:12:12.146 "uuid": "421f578a-e569-554f-8228-633eee07d20c",
00:12:12.146 "is_configured": true,
00:12:12.146 "data_offset": 2048,
00:12:12.146 "data_size": 63488
00:12:12.146 },
00:12:12.146 {
00:12:12.146 "name": "BaseBdev2",
00:12:12.146 "uuid": "cfa59257-c25f-5cae-9c8f-7983baa7ca77",
00:12:12.146 "is_configured": true,
00:12:12.146 "data_offset": 2048,
00:12:12.146 "data_size": 63488
00:12:12.146 },
00:12:12.146 {
00:12:12.146 "name": "BaseBdev3",
00:12:12.146 "uuid": "6824722e-a76c-5939-aa96-eff140eb9ec3",
00:12:12.146 "is_configured": true,
00:12:12.146 "data_offset": 2048,
00:12:12.146 "data_size": 63488
00:12:12.146 }
00:12:12.146 ]
00:12:12.146 }'
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:12:12.146 11:53:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:12:12.715 11:53:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:12:12.974 [2024-07-12 11:53:02.995345] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:12:12.975 [2024-07-12 11:53:02.995369] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:12:12.975 [2024-07-12 11:53:02.997510] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:12:12.975 [2024-07-12 11:53:02.997538] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:12:12.975 [2024-07-12 11:53:02.997559] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:12:12.975 [2024-07-12 11:53:02.997565] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2789000 name raid_bdev1, state offline
00:12:12.975 0
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 620903
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 620903 ']'
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 620903
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 620903
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 620903'
killing process with pid 620903
11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 620903
[2024-07-12 11:53:03.058773] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:12:12.975 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 620903
00:12:12.975 [2024-07-12 11:53:03.076980] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.vCyaSHjZ86
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]]
00:12:13.235
00:12:13.235 real 0m5.410s
00:12:13.235 user 0m8.405s
00:12:13.235 sys 0m0.769s
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:12:13.235 11:53:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:12:13.235 ************************************
00:12:13.235 END TEST raid_read_error_test
00:12:13.235 ************************************
00:12:13.235 11:53:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:12:13.235 11:53:03 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write
00:12:13.235 11:53:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:12:13.235 11:53:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:12:13.235 11:53:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:12:13.235 ************************************
00:12:13.235 START TEST raid_write_error_test
************************************
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 ))
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3')
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:12:13.235 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']'
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64'
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.9mODr6r2wt
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=621918
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 621918 /var/tmp/spdk-raid.sock
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 621918 ']'
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:12:13.236 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:12:13.236 11:53:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:12:13.236 [2024-07-12 11:53:03.399543] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization...
00:12:13.236 [2024-07-12 11:53:03.399581] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid621918 ]
00:12:13.236 [2024-07-12 11:53:03.463916] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:12:13.495 [2024-07-12 11:53:03.539658] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:12:13.495 [2024-07-12 11:53:03.594411] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:12:13.495 [2024-07-12 11:53:03.594445] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:12:14.060 11:53:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:12:14.060 11:53:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0
00:12:14.060 11:53:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:12:14.060 11:53:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:12:14.318 BaseBdev1_malloc
00:12:14.318 11:53:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:12:14.318 true
00:12:14.318 11:53:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:12:14.576 [2024-07-12 11:53:04.670492] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:12:14.576 [2024-07-12 11:53:04.670529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:12:14.577 [2024-07-12 11:53:04.670539] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x195d2d0
00:12:14.577 [2024-07-12 11:53:04.670545] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:12:14.577 [2024-07-12 11:53:04.671685] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:12:14.577 [2024-07-12 11:53:04.671706] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:12:14.577 BaseBdev1
00:12:14.577 11:53:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:12:14.577 11:53:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:12:14.835 BaseBdev2_malloc
00:12:14.835 11:53:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:12:14.835 true
00:12:14.836 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:12:15.094 [2024-07-12 11:53:05.179052] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:12:15.094 [2024-07-12 11:53:05.179084] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:12:15.094 [2024-07-12 11:53:05.179096] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1961f40
00:12:15.094 [2024-07-12 11:53:05.179103] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:12:15.094 [2024-07-12 11:53:05.180252] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:12:15.094 [2024-07-12 11:53:05.180274] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:12:15.094 BaseBdev2
00:12:15.094 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:12:15.094 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:12:15.353 BaseBdev3_malloc
00:12:15.353 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:12:15.353 true
00:12:15.353 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:12:15.613 [2024-07-12 11:53:05.687684] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:12:15.613 [2024-07-12 11:53:05.687712] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:12:15.613 [2024-07-12 11:53:05.687725] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1964ea0
00:12:15.613 [2024-07-12 11:53:05.687731] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:12:15.613 [2024-07-12 11:53:05.688647] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:12:15.613 [2024-07-12 11:53:05.688665] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:12:15.613 BaseBdev3
00:12:15.613 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
00:12:15.613 [2024-07-12 11:53:05.856136] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:12:15.613 [2024-07-12 11:53:05.856931] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:12:15.613 [2024-07-12 11:53:05.856974] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:12:15.613 [2024-07-12 11:53:05.857106] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x195e000
00:12:15.613 [2024-07-12 11:53:05.857113] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512
00:12:15.613 [2024-07-12 11:53:05.857229] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19647c0
00:12:15.613 [2024-07-12 11:53:05.857322] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x195e000
00:12:15.613 [2024-07-12 11:53:05.857327] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x195e000
00:12:15.613 [2024-07-12 11:53:05.857389] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:12:15.873 11:53:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:12:15.873 11:53:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:12:15.873 "name": "raid_bdev1",
00:12:15.873 "uuid": "56f5a54e-2216-417e-ad71-3e2827e1edcd",
00:12:15.873 "strip_size_kb": 64,
00:12:15.873 "state": "online",
00:12:15.873 "raid_level": "raid0",
00:12:15.873 "superblock": true,
00:12:15.873 "num_base_bdevs": 3,
00:12:15.873 "num_base_bdevs_discovered": 3,
00:12:15.873 "num_base_bdevs_operational": 3,
00:12:15.873 "base_bdevs_list": [
00:12:15.873 {
00:12:15.873 "name": "BaseBdev1",
00:12:15.873 "uuid": "8e12bc3f-99d1-5dd3-95a3-8182c28c1b40",
00:12:15.873 "is_configured": true,
00:12:15.873 "data_offset": 2048,
00:12:15.873 "data_size": 63488
00:12:15.873 },
00:12:15.873 {
00:12:15.873 "name": "BaseBdev2",
00:12:15.873 "uuid": "2ccf9acf-4709-550f-83eb-be9845e98123",
00:12:15.873 "is_configured": true,
00:12:15.873 "data_offset": 2048,
00:12:15.873 "data_size": 63488
00:12:15.873 },
00:12:15.873 {
00:12:15.873 "name": "BaseBdev3",
00:12:15.873 "uuid": "cf095925-07a5-5393-8a85-d423c5cee468",
00:12:15.873 "is_configured": true,
00:12:15.873 "data_offset": 2048,
00:12:15.873 "data_size": 63488
00:12:15.873 }
00:12:15.873 ]
00:12:15.873 }'
00:12:15.873 11:53:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:12:15.873 11:53:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:12:16.441 11:53:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:12:16.441 11:53:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:12:16.441 [2024-07-12 11:53:06.630327] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b3840
00:12:17.379 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]]
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:12:17.638 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:12:17.897 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:12:17.897 "name": "raid_bdev1",
00:12:17.897 "uuid": "56f5a54e-2216-417e-ad71-3e2827e1edcd",
00:12:17.897 "strip_size_kb": 64,
00:12:17.897 "state": "online",
00:12:17.897 "raid_level": "raid0",
00:12:17.897 "superblock": true,
00:12:17.897 "num_base_bdevs": 3,
00:12:17.897 "num_base_bdevs_discovered": 3,
00:12:17.897 "num_base_bdevs_operational": 3,
00:12:17.897 "base_bdevs_list": [
00:12:17.897 {
00:12:17.897 "name": "BaseBdev1",
00:12:17.897 "uuid": "8e12bc3f-99d1-5dd3-95a3-8182c28c1b40",
00:12:17.897 "is_configured": true,
00:12:17.897 "data_offset": 2048,
00:12:17.897 "data_size": 63488
00:12:17.897 },
00:12:17.897 {
00:12:17.897 "name": "BaseBdev2",
00:12:17.897 "uuid": "2ccf9acf-4709-550f-83eb-be9845e98123",
00:12:17.897 "is_configured": true,
00:12:17.897 "data_offset": 2048,
00:12:17.897 "data_size": 63488
00:12:17.897 },
00:12:17.897 {
00:12:17.897 "name": "BaseBdev3",
00:12:17.897 "uuid": "cf095925-07a5-5393-8a85-d423c5cee468",
00:12:17.897 "is_configured": true,
00:12:17.897 "data_offset": 2048,
00:12:17.897 "data_size": 63488
00:12:17.897 }
00:12:17.897 ]
00:12:17.897 }'
00:12:17.897 11:53:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:12:17.897 11:53:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:12:18.156 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:12:18.416 [2024-07-12 11:53:08.522718] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:12:18.416 [2024-07-12 11:53:08.522753] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:12:18.416 [2024-07-12 11:53:08.524772] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:12:18.416 [2024-07-12 11:53:08.524809] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:12:18.416 [2024-07-12 11:53:08.524830] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:12:18.416 [2024-07-12 11:53:08.524836] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195e000 name raid_bdev1, state offline
00:12:18.416 0
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 621918
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 621918 ']'
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 621918
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 621918
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 621918'
killing process with pid 621918
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 621918
[2024-07-12 11:53:08.586035] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:12:18.416 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 621918
00:12:18.416 [2024-07-12 11:53:08.604017] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.9mODr6r2wt
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]]
00:12:18.675
00:12:18.675 real 0m5.458s
00:12:18.675 user 0m8.482s
00:12:18.675 sys 0m0.795s
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:12:18.675 11:53:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:12:18.675 ************************************
00:12:18.675 END TEST raid_write_error_test
00:12:18.675 ************************************
00:12:18.675 11:53:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:12:18.675 11:53:08 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1
00:12:18.675 11:53:08 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false
00:12:18.675 11:53:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:12:18.675 11:53:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:12:18.675 11:53:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:12:18.675 ************************************
00:12:18.675 START TEST raid_state_function_test
************************************
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3')
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']'
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=622924
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 622924'
Process raid pid: 622924
11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 622924 /var/tmp/spdk-raid.sock
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 622924 ']'
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:12:18.676 11:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:12:18.676 [2024-07-12 11:53:08.921293] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization...
00:12:18.676 [2024-07-12 11:53:08.921330] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:12:18.935 [2024-07-12 11:53:08.985481] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:12:18.935 [2024-07-12 11:53:09.063534] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:12:18.935 [2024-07-12 11:53:09.119017] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:12:18.935 [2024-07-12 11:53:09.119042] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:12:19.503 11:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:12:19.503 11:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0
00:12:19.503 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:12:19.762 [2024-07-12 11:53:09.846006] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:12:19.762 [2024-07-12 11:53:09.846039] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:12:19.762 [2024-07-12 11:53:09.846045] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:12:19.762 [2024-07-12 11:53:09.846051] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:12:19.762 [2024-07-12 11:53:09.846055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:12:19.762 [2024-07-12 11:53:09.846060] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:12:19.762 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:12:20.021 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:12:20.021 "name": "Existed_Raid",
00:12:20.021 "uuid": "00000000-0000-0000-0000-000000000000",
00:12:20.021 "strip_size_kb": 64,
00:12:20.021 "state": "configuring",
00:12:20.021 "raid_level": "concat",
00:12:20.021 "superblock": false,
00:12:20.021 "num_base_bdevs": 3,
00:12:20.021 "num_base_bdevs_discovered": 0,
00:12:20.021 "num_base_bdevs_operational": 3,
00:12:20.021 "base_bdevs_list": [
00:12:20.021 {
00:12:20.021 "name": "BaseBdev1",
00:12:20.021 "uuid": "00000000-0000-0000-0000-000000000000",
00:12:20.021 "is_configured": false,
00:12:20.021 "data_offset": 0,
00:12:20.021 "data_size": 0
00:12:20.021 },
00:12:20.021 {
00:12:20.021 "name": "BaseBdev2",
00:12:20.021 "uuid": "00000000-0000-0000-0000-000000000000",
00:12:20.021 "is_configured": false,
00:12:20.021 "data_offset": 0,
00:12:20.021 "data_size": 0
00:12:20.021 },
00:12:20.021 {
00:12:20.021 "name": "BaseBdev3",
00:12:20.021 "uuid": "00000000-0000-0000-0000-000000000000",
00:12:20.021 "is_configured": false,
00:12:20.021 "data_offset": 0,
00:12:20.021 "data_size": 0
00:12:20.021 }
00:12:20.021 ]
00:12:20.021 }'
00:12:20.021 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:12:20.021 11:53:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:12:20.620 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:12:20.620 [2024-07-12 11:53:10.684101] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:12:20.620 [2024-07-12 11:53:10.684126] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19431d0 name Existed_Raid, state configuring
00:12:20.620 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:12:20.620 [2024-07-12 11:53:10.844522] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:12:20.620 [2024-07-12 11:53:10.844545] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:12:20.620 [2024-07-12 11:53:10.844550] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with
name: BaseBdev2 00:12:20.620 [2024-07-12 11:53:10.844559] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:20.620 [2024-07-12 11:53:10.844564] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:20.620 [2024-07-12 11:53:10.844569] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:20.620 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:20.879 [2024-07-12 11:53:11.013068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:20.879 BaseBdev1 00:12:20.879 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:20.879 11:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:20.879 11:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:20.879 11:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:20.879 11:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:20.879 11:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:20.879 11:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:21.137 [ 00:12:21.137 { 00:12:21.137 "name": "BaseBdev1", 00:12:21.137 "aliases": [ 00:12:21.137 
"be26beb3-85bc-4f4f-a10d-e6347cd54e2f" 00:12:21.137 ], 00:12:21.137 "product_name": "Malloc disk", 00:12:21.137 "block_size": 512, 00:12:21.137 "num_blocks": 65536, 00:12:21.137 "uuid": "be26beb3-85bc-4f4f-a10d-e6347cd54e2f", 00:12:21.137 "assigned_rate_limits": { 00:12:21.137 "rw_ios_per_sec": 0, 00:12:21.137 "rw_mbytes_per_sec": 0, 00:12:21.137 "r_mbytes_per_sec": 0, 00:12:21.137 "w_mbytes_per_sec": 0 00:12:21.137 }, 00:12:21.137 "claimed": true, 00:12:21.137 "claim_type": "exclusive_write", 00:12:21.137 "zoned": false, 00:12:21.137 "supported_io_types": { 00:12:21.137 "read": true, 00:12:21.137 "write": true, 00:12:21.137 "unmap": true, 00:12:21.137 "flush": true, 00:12:21.137 "reset": true, 00:12:21.137 "nvme_admin": false, 00:12:21.137 "nvme_io": false, 00:12:21.137 "nvme_io_md": false, 00:12:21.137 "write_zeroes": true, 00:12:21.137 "zcopy": true, 00:12:21.137 "get_zone_info": false, 00:12:21.137 "zone_management": false, 00:12:21.137 "zone_append": false, 00:12:21.137 "compare": false, 00:12:21.137 "compare_and_write": false, 00:12:21.137 "abort": true, 00:12:21.137 "seek_hole": false, 00:12:21.137 "seek_data": false, 00:12:21.137 "copy": true, 00:12:21.137 "nvme_iov_md": false 00:12:21.137 }, 00:12:21.137 "memory_domains": [ 00:12:21.137 { 00:12:21.137 "dma_device_id": "system", 00:12:21.137 "dma_device_type": 1 00:12:21.137 }, 00:12:21.137 { 00:12:21.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.137 "dma_device_type": 2 00:12:21.137 } 00:12:21.137 ], 00:12:21.137 "driver_specific": {} 00:12:21.137 } 00:12:21.137 ] 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.137 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:21.396 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:21.396 "name": "Existed_Raid", 00:12:21.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:21.396 "strip_size_kb": 64, 00:12:21.396 "state": "configuring", 00:12:21.396 "raid_level": "concat", 00:12:21.396 "superblock": false, 00:12:21.396 "num_base_bdevs": 3, 00:12:21.396 "num_base_bdevs_discovered": 1, 00:12:21.397 "num_base_bdevs_operational": 3, 00:12:21.397 "base_bdevs_list": [ 00:12:21.397 { 00:12:21.397 "name": "BaseBdev1", 00:12:21.397 "uuid": "be26beb3-85bc-4f4f-a10d-e6347cd54e2f", 00:12:21.397 "is_configured": true, 00:12:21.397 "data_offset": 0, 00:12:21.397 "data_size": 65536 00:12:21.397 }, 00:12:21.397 { 00:12:21.397 "name": "BaseBdev2", 00:12:21.397 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:21.397 "is_configured": false, 00:12:21.397 "data_offset": 0, 00:12:21.397 "data_size": 0 00:12:21.397 }, 00:12:21.397 { 00:12:21.397 "name": "BaseBdev3", 00:12:21.397 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:21.397 "is_configured": false, 00:12:21.397 "data_offset": 0, 00:12:21.397 "data_size": 0 00:12:21.397 } 00:12:21.397 ] 00:12:21.397 }' 00:12:21.397 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:21.397 11:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.963 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:21.963 [2024-07-12 11:53:12.184116] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:21.963 [2024-07-12 11:53:12.184144] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1942aa0 name Existed_Raid, state configuring 00:12:21.963 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:22.221 [2024-07-12 11:53:12.356586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:22.221 [2024-07-12 11:53:12.357599] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:22.221 [2024-07-12 11:53:12.357621] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:22.221 [2024-07-12 11:53:12.357626] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:22.221 [2024-07-12 11:53:12.357631] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:12:22.221 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.222 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.480 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.480 "name": "Existed_Raid", 00:12:22.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.480 "strip_size_kb": 64, 00:12:22.480 "state": "configuring", 00:12:22.480 
"raid_level": "concat", 00:12:22.480 "superblock": false, 00:12:22.480 "num_base_bdevs": 3, 00:12:22.480 "num_base_bdevs_discovered": 1, 00:12:22.480 "num_base_bdevs_operational": 3, 00:12:22.480 "base_bdevs_list": [ 00:12:22.480 { 00:12:22.480 "name": "BaseBdev1", 00:12:22.480 "uuid": "be26beb3-85bc-4f4f-a10d-e6347cd54e2f", 00:12:22.480 "is_configured": true, 00:12:22.480 "data_offset": 0, 00:12:22.480 "data_size": 65536 00:12:22.480 }, 00:12:22.480 { 00:12:22.480 "name": "BaseBdev2", 00:12:22.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.480 "is_configured": false, 00:12:22.480 "data_offset": 0, 00:12:22.480 "data_size": 0 00:12:22.480 }, 00:12:22.480 { 00:12:22.480 "name": "BaseBdev3", 00:12:22.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.480 "is_configured": false, 00:12:22.480 "data_offset": 0, 00:12:22.480 "data_size": 0 00:12:22.480 } 00:12:22.480 ] 00:12:22.480 }' 00:12:22.480 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.480 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.044 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:23.044 [2024-07-12 11:53:13.205374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:23.044 BaseBdev2 00:12:23.044 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:23.044 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:23.044 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:23.044 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:23.044 11:53:13 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:23.044 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:23.044 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:23.301 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:23.301 [ 00:12:23.301 { 00:12:23.301 "name": "BaseBdev2", 00:12:23.301 "aliases": [ 00:12:23.301 "06e0835d-c19c-49bd-9df4-ecc696cff01c" 00:12:23.301 ], 00:12:23.301 "product_name": "Malloc disk", 00:12:23.301 "block_size": 512, 00:12:23.301 "num_blocks": 65536, 00:12:23.301 "uuid": "06e0835d-c19c-49bd-9df4-ecc696cff01c", 00:12:23.301 "assigned_rate_limits": { 00:12:23.301 "rw_ios_per_sec": 0, 00:12:23.301 "rw_mbytes_per_sec": 0, 00:12:23.301 "r_mbytes_per_sec": 0, 00:12:23.301 "w_mbytes_per_sec": 0 00:12:23.301 }, 00:12:23.301 "claimed": true, 00:12:23.301 "claim_type": "exclusive_write", 00:12:23.301 "zoned": false, 00:12:23.301 "supported_io_types": { 00:12:23.301 "read": true, 00:12:23.301 "write": true, 00:12:23.301 "unmap": true, 00:12:23.301 "flush": true, 00:12:23.301 "reset": true, 00:12:23.301 "nvme_admin": false, 00:12:23.301 "nvme_io": false, 00:12:23.301 "nvme_io_md": false, 00:12:23.301 "write_zeroes": true, 00:12:23.301 "zcopy": true, 00:12:23.301 "get_zone_info": false, 00:12:23.301 "zone_management": false, 00:12:23.301 "zone_append": false, 00:12:23.301 "compare": false, 00:12:23.301 "compare_and_write": false, 00:12:23.301 "abort": true, 00:12:23.301 "seek_hole": false, 00:12:23.301 "seek_data": false, 00:12:23.301 "copy": true, 00:12:23.301 "nvme_iov_md": false 00:12:23.301 }, 00:12:23.301 "memory_domains": [ 00:12:23.301 { 00:12:23.302 "dma_device_id": "system", 
00:12:23.302 "dma_device_type": 1 00:12:23.302 }, 00:12:23.302 { 00:12:23.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.302 "dma_device_type": 2 00:12:23.302 } 00:12:23.302 ], 00:12:23.302 "driver_specific": {} 00:12:23.302 } 00:12:23.302 ] 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:23.558 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:23.559 "name": "Existed_Raid", 00:12:23.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:23.559 "strip_size_kb": 64, 00:12:23.559 "state": "configuring", 00:12:23.559 "raid_level": "concat", 00:12:23.559 "superblock": false, 00:12:23.559 "num_base_bdevs": 3, 00:12:23.559 "num_base_bdevs_discovered": 2, 00:12:23.559 "num_base_bdevs_operational": 3, 00:12:23.559 "base_bdevs_list": [ 00:12:23.559 { 00:12:23.559 "name": "BaseBdev1", 00:12:23.559 "uuid": "be26beb3-85bc-4f4f-a10d-e6347cd54e2f", 00:12:23.559 "is_configured": true, 00:12:23.559 "data_offset": 0, 00:12:23.559 "data_size": 65536 00:12:23.559 }, 00:12:23.559 { 00:12:23.559 "name": "BaseBdev2", 00:12:23.559 "uuid": "06e0835d-c19c-49bd-9df4-ecc696cff01c", 00:12:23.559 "is_configured": true, 00:12:23.559 "data_offset": 0, 00:12:23.559 "data_size": 65536 00:12:23.559 }, 00:12:23.559 { 00:12:23.559 "name": "BaseBdev3", 00:12:23.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:23.559 "is_configured": false, 00:12:23.559 "data_offset": 0, 00:12:23.559 "data_size": 0 00:12:23.559 } 00:12:23.559 ] 00:12:23.559 }' 00:12:23.559 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:23.559 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.125 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:24.125 [2024-07-12 11:53:14.355114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:24.125 [2024-07-12 11:53:14.355141] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1943990 00:12:24.125 [2024-07-12 11:53:14.355145] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:24.125 [2024-07-12 11:53:14.355276] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1649210 00:12:24.125 [2024-07-12 11:53:14.355358] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1943990 00:12:24.125 [2024-07-12 11:53:14.355363] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1943990 00:12:24.125 [2024-07-12 11:53:14.355489] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:24.125 BaseBdev3 00:12:24.125 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:24.125 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:24.125 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:24.125 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:24.125 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:24.125 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:24.125 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:24.385 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:24.644 [ 00:12:24.644 { 00:12:24.644 "name": "BaseBdev3", 00:12:24.644 "aliases": [ 00:12:24.644 "37e264eb-c426-4421-bf81-6a6eba0200ee" 00:12:24.644 ], 00:12:24.644 "product_name": "Malloc disk", 00:12:24.644 "block_size": 512, 00:12:24.644 "num_blocks": 65536, 00:12:24.644 
"uuid": "37e264eb-c426-4421-bf81-6a6eba0200ee", 00:12:24.644 "assigned_rate_limits": { 00:12:24.644 "rw_ios_per_sec": 0, 00:12:24.644 "rw_mbytes_per_sec": 0, 00:12:24.644 "r_mbytes_per_sec": 0, 00:12:24.644 "w_mbytes_per_sec": 0 00:12:24.644 }, 00:12:24.644 "claimed": true, 00:12:24.644 "claim_type": "exclusive_write", 00:12:24.644 "zoned": false, 00:12:24.644 "supported_io_types": { 00:12:24.644 "read": true, 00:12:24.644 "write": true, 00:12:24.644 "unmap": true, 00:12:24.644 "flush": true, 00:12:24.644 "reset": true, 00:12:24.644 "nvme_admin": false, 00:12:24.644 "nvme_io": false, 00:12:24.644 "nvme_io_md": false, 00:12:24.644 "write_zeroes": true, 00:12:24.644 "zcopy": true, 00:12:24.644 "get_zone_info": false, 00:12:24.644 "zone_management": false, 00:12:24.644 "zone_append": false, 00:12:24.644 "compare": false, 00:12:24.644 "compare_and_write": false, 00:12:24.644 "abort": true, 00:12:24.644 "seek_hole": false, 00:12:24.644 "seek_data": false, 00:12:24.644 "copy": true, 00:12:24.644 "nvme_iov_md": false 00:12:24.644 }, 00:12:24.644 "memory_domains": [ 00:12:24.644 { 00:12:24.644 "dma_device_id": "system", 00:12:24.644 "dma_device_type": 1 00:12:24.644 }, 00:12:24.644 { 00:12:24.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.644 "dma_device_type": 2 00:12:24.644 } 00:12:24.644 ], 00:12:24.644 "driver_specific": {} 00:12:24.644 } 00:12:24.644 ] 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:24.644 11:53:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:24.644 "name": "Existed_Raid", 00:12:24.644 "uuid": "5b318266-70d4-4da9-a4f0-3135e9c318f1", 00:12:24.644 "strip_size_kb": 64, 00:12:24.644 "state": "online", 00:12:24.644 "raid_level": "concat", 00:12:24.644 "superblock": false, 00:12:24.644 "num_base_bdevs": 3, 00:12:24.644 "num_base_bdevs_discovered": 3, 00:12:24.644 "num_base_bdevs_operational": 3, 00:12:24.644 "base_bdevs_list": [ 00:12:24.644 { 00:12:24.644 "name": "BaseBdev1", 00:12:24.644 "uuid": "be26beb3-85bc-4f4f-a10d-e6347cd54e2f", 00:12:24.644 "is_configured": true, 00:12:24.644 "data_offset": 0, 00:12:24.644 "data_size": 65536 00:12:24.644 }, 00:12:24.644 { 00:12:24.644 "name": "BaseBdev2", 00:12:24.644 "uuid": 
"06e0835d-c19c-49bd-9df4-ecc696cff01c", 00:12:24.644 "is_configured": true, 00:12:24.644 "data_offset": 0, 00:12:24.644 "data_size": 65536 00:12:24.644 }, 00:12:24.644 { 00:12:24.644 "name": "BaseBdev3", 00:12:24.644 "uuid": "37e264eb-c426-4421-bf81-6a6eba0200ee", 00:12:24.644 "is_configured": true, 00:12:24.644 "data_offset": 0, 00:12:24.644 "data_size": 65536 00:12:24.644 } 00:12:24.644 ] 00:12:24.644 }' 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:24.644 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:25.211 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:25.211 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:25.211 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:25.211 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:25.211 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:25.211 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:25.211 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:25.211 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:25.470 [2024-07-12 11:53:15.502274] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:25.470 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:25.470 "name": "Existed_Raid", 00:12:25.470 "aliases": [ 00:12:25.470 "5b318266-70d4-4da9-a4f0-3135e9c318f1" 00:12:25.470 ], 00:12:25.470 "product_name": "Raid Volume", 
00:12:25.470 "block_size": 512, 00:12:25.470 "num_blocks": 196608, 00:12:25.470 "uuid": "5b318266-70d4-4da9-a4f0-3135e9c318f1", 00:12:25.470 "assigned_rate_limits": { 00:12:25.470 "rw_ios_per_sec": 0, 00:12:25.470 "rw_mbytes_per_sec": 0, 00:12:25.470 "r_mbytes_per_sec": 0, 00:12:25.470 "w_mbytes_per_sec": 0 00:12:25.470 }, 00:12:25.470 "claimed": false, 00:12:25.470 "zoned": false, 00:12:25.470 "supported_io_types": { 00:12:25.470 "read": true, 00:12:25.470 "write": true, 00:12:25.470 "unmap": true, 00:12:25.470 "flush": true, 00:12:25.470 "reset": true, 00:12:25.470 "nvme_admin": false, 00:12:25.470 "nvme_io": false, 00:12:25.470 "nvme_io_md": false, 00:12:25.471 "write_zeroes": true, 00:12:25.471 "zcopy": false, 00:12:25.471 "get_zone_info": false, 00:12:25.471 "zone_management": false, 00:12:25.471 "zone_append": false, 00:12:25.471 "compare": false, 00:12:25.471 "compare_and_write": false, 00:12:25.471 "abort": false, 00:12:25.471 "seek_hole": false, 00:12:25.471 "seek_data": false, 00:12:25.471 "copy": false, 00:12:25.471 "nvme_iov_md": false 00:12:25.471 }, 00:12:25.471 "memory_domains": [ 00:12:25.471 { 00:12:25.471 "dma_device_id": "system", 00:12:25.471 "dma_device_type": 1 00:12:25.471 }, 00:12:25.471 { 00:12:25.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.471 "dma_device_type": 2 00:12:25.471 }, 00:12:25.471 { 00:12:25.471 "dma_device_id": "system", 00:12:25.471 "dma_device_type": 1 00:12:25.471 }, 00:12:25.471 { 00:12:25.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.471 "dma_device_type": 2 00:12:25.471 }, 00:12:25.471 { 00:12:25.471 "dma_device_id": "system", 00:12:25.471 "dma_device_type": 1 00:12:25.471 }, 00:12:25.471 { 00:12:25.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.471 "dma_device_type": 2 00:12:25.471 } 00:12:25.471 ], 00:12:25.471 "driver_specific": { 00:12:25.471 "raid": { 00:12:25.471 "uuid": "5b318266-70d4-4da9-a4f0-3135e9c318f1", 00:12:25.471 "strip_size_kb": 64, 00:12:25.471 "state": "online", 00:12:25.471 
"raid_level": "concat", 00:12:25.471 "superblock": false, 00:12:25.471 "num_base_bdevs": 3, 00:12:25.471 "num_base_bdevs_discovered": 3, 00:12:25.471 "num_base_bdevs_operational": 3, 00:12:25.471 "base_bdevs_list": [ 00:12:25.471 { 00:12:25.471 "name": "BaseBdev1", 00:12:25.471 "uuid": "be26beb3-85bc-4f4f-a10d-e6347cd54e2f", 00:12:25.471 "is_configured": true, 00:12:25.471 "data_offset": 0, 00:12:25.471 "data_size": 65536 00:12:25.471 }, 00:12:25.471 { 00:12:25.471 "name": "BaseBdev2", 00:12:25.471 "uuid": "06e0835d-c19c-49bd-9df4-ecc696cff01c", 00:12:25.471 "is_configured": true, 00:12:25.471 "data_offset": 0, 00:12:25.471 "data_size": 65536 00:12:25.471 }, 00:12:25.471 { 00:12:25.471 "name": "BaseBdev3", 00:12:25.471 "uuid": "37e264eb-c426-4421-bf81-6a6eba0200ee", 00:12:25.471 "is_configured": true, 00:12:25.471 "data_offset": 0, 00:12:25.471 "data_size": 65536 00:12:25.471 } 00:12:25.471 ] 00:12:25.471 } 00:12:25.471 } 00:12:25.471 }' 00:12:25.471 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:25.471 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:25.471 BaseBdev2 00:12:25.471 BaseBdev3' 00:12:25.471 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:25.471 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:25.471 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:25.471 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:25.471 "name": "BaseBdev1", 00:12:25.471 "aliases": [ 00:12:25.471 "be26beb3-85bc-4f4f-a10d-e6347cd54e2f" 00:12:25.471 ], 00:12:25.471 "product_name": "Malloc disk", 00:12:25.471 
"block_size": 512, 00:12:25.471 "num_blocks": 65536, 00:12:25.471 "uuid": "be26beb3-85bc-4f4f-a10d-e6347cd54e2f", 00:12:25.471 "assigned_rate_limits": { 00:12:25.471 "rw_ios_per_sec": 0, 00:12:25.471 "rw_mbytes_per_sec": 0, 00:12:25.471 "r_mbytes_per_sec": 0, 00:12:25.471 "w_mbytes_per_sec": 0 00:12:25.471 }, 00:12:25.471 "claimed": true, 00:12:25.471 "claim_type": "exclusive_write", 00:12:25.471 "zoned": false, 00:12:25.471 "supported_io_types": { 00:12:25.471 "read": true, 00:12:25.471 "write": true, 00:12:25.471 "unmap": true, 00:12:25.471 "flush": true, 00:12:25.471 "reset": true, 00:12:25.471 "nvme_admin": false, 00:12:25.471 "nvme_io": false, 00:12:25.471 "nvme_io_md": false, 00:12:25.471 "write_zeroes": true, 00:12:25.471 "zcopy": true, 00:12:25.471 "get_zone_info": false, 00:12:25.471 "zone_management": false, 00:12:25.471 "zone_append": false, 00:12:25.471 "compare": false, 00:12:25.471 "compare_and_write": false, 00:12:25.471 "abort": true, 00:12:25.471 "seek_hole": false, 00:12:25.471 "seek_data": false, 00:12:25.471 "copy": true, 00:12:25.471 "nvme_iov_md": false 00:12:25.471 }, 00:12:25.471 "memory_domains": [ 00:12:25.471 { 00:12:25.471 "dma_device_id": "system", 00:12:25.471 "dma_device_type": 1 00:12:25.471 }, 00:12:25.471 { 00:12:25.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.471 "dma_device_type": 2 00:12:25.471 } 00:12:25.471 ], 00:12:25.471 "driver_specific": {} 00:12:25.471 }' 00:12:25.471 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:25.730 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:25.730 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:25.730 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:25.730 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:25.730 11:53:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:25.730 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:25.730 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:25.730 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:25.730 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:25.989 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:25.989 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:25.989 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:25.989 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:25.989 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:25.989 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:25.989 "name": "BaseBdev2", 00:12:25.989 "aliases": [ 00:12:25.989 "06e0835d-c19c-49bd-9df4-ecc696cff01c" 00:12:25.989 ], 00:12:25.989 "product_name": "Malloc disk", 00:12:25.989 "block_size": 512, 00:12:25.989 "num_blocks": 65536, 00:12:25.989 "uuid": "06e0835d-c19c-49bd-9df4-ecc696cff01c", 00:12:25.989 "assigned_rate_limits": { 00:12:25.989 "rw_ios_per_sec": 0, 00:12:25.989 "rw_mbytes_per_sec": 0, 00:12:25.989 "r_mbytes_per_sec": 0, 00:12:25.989 "w_mbytes_per_sec": 0 00:12:25.989 }, 00:12:25.989 "claimed": true, 00:12:25.989 "claim_type": "exclusive_write", 00:12:25.989 "zoned": false, 00:12:25.989 "supported_io_types": { 00:12:25.989 "read": true, 00:12:25.989 "write": true, 00:12:25.989 "unmap": true, 00:12:25.989 "flush": true, 00:12:25.989 "reset": true, 00:12:25.989 "nvme_admin": 
false, 00:12:25.989 "nvme_io": false, 00:12:25.989 "nvme_io_md": false, 00:12:25.989 "write_zeroes": true, 00:12:25.989 "zcopy": true, 00:12:25.989 "get_zone_info": false, 00:12:25.989 "zone_management": false, 00:12:25.989 "zone_append": false, 00:12:25.989 "compare": false, 00:12:25.989 "compare_and_write": false, 00:12:25.989 "abort": true, 00:12:25.989 "seek_hole": false, 00:12:25.989 "seek_data": false, 00:12:25.989 "copy": true, 00:12:25.989 "nvme_iov_md": false 00:12:25.989 }, 00:12:25.989 "memory_domains": [ 00:12:25.989 { 00:12:25.989 "dma_device_id": "system", 00:12:25.989 "dma_device_type": 1 00:12:25.989 }, 00:12:25.989 { 00:12:25.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.989 "dma_device_type": 2 00:12:25.989 } 00:12:25.989 ], 00:12:25.989 "driver_specific": {} 00:12:25.989 }' 00:12:25.989 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:25.989 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:26.248 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:26.507 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:26.507 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:26.507 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:26.507 "name": "BaseBdev3", 00:12:26.507 "aliases": [ 00:12:26.507 "37e264eb-c426-4421-bf81-6a6eba0200ee" 00:12:26.507 ], 00:12:26.507 "product_name": "Malloc disk", 00:12:26.507 "block_size": 512, 00:12:26.507 "num_blocks": 65536, 00:12:26.507 "uuid": "37e264eb-c426-4421-bf81-6a6eba0200ee", 00:12:26.507 "assigned_rate_limits": { 00:12:26.507 "rw_ios_per_sec": 0, 00:12:26.507 "rw_mbytes_per_sec": 0, 00:12:26.507 "r_mbytes_per_sec": 0, 00:12:26.507 "w_mbytes_per_sec": 0 00:12:26.507 }, 00:12:26.507 "claimed": true, 00:12:26.507 "claim_type": "exclusive_write", 00:12:26.507 "zoned": false, 00:12:26.507 "supported_io_types": { 00:12:26.507 "read": true, 00:12:26.507 "write": true, 00:12:26.507 "unmap": true, 00:12:26.507 "flush": true, 00:12:26.507 "reset": true, 00:12:26.507 "nvme_admin": false, 00:12:26.507 "nvme_io": false, 00:12:26.507 "nvme_io_md": false, 00:12:26.507 "write_zeroes": true, 00:12:26.507 "zcopy": true, 00:12:26.507 "get_zone_info": false, 00:12:26.507 "zone_management": false, 00:12:26.507 "zone_append": false, 00:12:26.507 "compare": false, 00:12:26.507 "compare_and_write": false, 00:12:26.507 "abort": true, 00:12:26.507 "seek_hole": false, 00:12:26.507 "seek_data": false, 00:12:26.507 "copy": true, 00:12:26.507 "nvme_iov_md": false 00:12:26.507 }, 00:12:26.507 "memory_domains": [ 00:12:26.507 { 00:12:26.507 "dma_device_id": "system", 00:12:26.507 "dma_device_type": 1 00:12:26.507 
}, 00:12:26.507 { 00:12:26.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:26.507 "dma_device_type": 2 00:12:26.507 } 00:12:26.507 ], 00:12:26.507 "driver_specific": {} 00:12:26.507 }' 00:12:26.507 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:26.507 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:26.507 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:26.507 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:26.766 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:26.766 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:26.766 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:26.766 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:26.766 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:26.766 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:26.766 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:26.766 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:26.766 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:27.025 [2024-07-12 11:53:17.142377] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:27.025 [2024-07-12 11:53:17.142395] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:27.025 [2024-07-12 11:53:17.142425] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:27.025 
11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.025 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:12:27.285 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:27.285 "name": "Existed_Raid", 00:12:27.285 "uuid": "5b318266-70d4-4da9-a4f0-3135e9c318f1", 00:12:27.285 "strip_size_kb": 64, 00:12:27.285 "state": "offline", 00:12:27.285 "raid_level": "concat", 00:12:27.285 "superblock": false, 00:12:27.285 "num_base_bdevs": 3, 00:12:27.285 "num_base_bdevs_discovered": 2, 00:12:27.285 "num_base_bdevs_operational": 2, 00:12:27.285 "base_bdevs_list": [ 00:12:27.285 { 00:12:27.285 "name": null, 00:12:27.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:27.285 "is_configured": false, 00:12:27.285 "data_offset": 0, 00:12:27.285 "data_size": 65536 00:12:27.285 }, 00:12:27.285 { 00:12:27.285 "name": "BaseBdev2", 00:12:27.285 "uuid": "06e0835d-c19c-49bd-9df4-ecc696cff01c", 00:12:27.285 "is_configured": true, 00:12:27.285 "data_offset": 0, 00:12:27.285 "data_size": 65536 00:12:27.285 }, 00:12:27.285 { 00:12:27.285 "name": "BaseBdev3", 00:12:27.285 "uuid": "37e264eb-c426-4421-bf81-6a6eba0200ee", 00:12:27.285 "is_configured": true, 00:12:27.285 "data_offset": 0, 00:12:27.285 "data_size": 65536 00:12:27.285 } 00:12:27.285 ] 00:12:27.285 }' 00:12:27.285 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:27.285 11:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.852 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:27.852 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:27.852 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.852 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:27.852 11:53:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:27.852 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:27.852 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:28.110 [2024-07-12 11:53:18.153752] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:28.110 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:28.110 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:28.110 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.110 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:28.110 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:28.110 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:28.110 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:28.369 [2024-07-12 11:53:18.500406] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:28.369 [2024-07-12 11:53:18.500435] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1943990 name Existed_Raid, state offline 00:12:28.369 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:28.369 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:28.369 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.369 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:28.628 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:28.628 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:28.628 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:28.629 BaseBdev2 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:28.629 11:53:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:28.888 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:29.147 [ 00:12:29.147 { 00:12:29.147 "name": "BaseBdev2", 00:12:29.147 "aliases": [ 00:12:29.147 "4d965c15-e28c-4bd1-9441-67e18ec54683" 00:12:29.147 ], 00:12:29.147 "product_name": "Malloc disk", 00:12:29.147 "block_size": 512, 00:12:29.147 "num_blocks": 65536, 00:12:29.147 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:29.147 "assigned_rate_limits": { 00:12:29.147 "rw_ios_per_sec": 0, 00:12:29.147 "rw_mbytes_per_sec": 0, 00:12:29.147 "r_mbytes_per_sec": 0, 00:12:29.147 "w_mbytes_per_sec": 0 00:12:29.147 }, 00:12:29.147 "claimed": false, 00:12:29.147 "zoned": false, 00:12:29.147 "supported_io_types": { 00:12:29.147 "read": true, 00:12:29.147 "write": true, 00:12:29.147 "unmap": true, 00:12:29.147 "flush": true, 00:12:29.147 "reset": true, 00:12:29.147 "nvme_admin": false, 00:12:29.147 "nvme_io": false, 00:12:29.147 "nvme_io_md": false, 00:12:29.147 "write_zeroes": true, 00:12:29.147 "zcopy": true, 00:12:29.147 "get_zone_info": false, 00:12:29.147 "zone_management": false, 00:12:29.147 "zone_append": false, 00:12:29.147 "compare": false, 00:12:29.147 "compare_and_write": false, 00:12:29.147 "abort": true, 00:12:29.147 "seek_hole": false, 00:12:29.147 "seek_data": false, 00:12:29.147 "copy": true, 00:12:29.147 "nvme_iov_md": false 00:12:29.147 }, 00:12:29.147 "memory_domains": [ 00:12:29.147 { 00:12:29.147 "dma_device_id": "system", 00:12:29.147 "dma_device_type": 1 00:12:29.147 }, 00:12:29.147 { 00:12:29.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.147 "dma_device_type": 2 00:12:29.147 } 00:12:29.147 ], 00:12:29.147 "driver_specific": {} 00:12:29.147 } 00:12:29.147 ] 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:29.147 BaseBdev3 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:29.147 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:29.406 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:29.665 [ 00:12:29.665 { 00:12:29.665 "name": "BaseBdev3", 00:12:29.665 "aliases": [ 00:12:29.665 "801b5e31-139d-49a2-96a1-7920422bcf25" 00:12:29.665 ], 00:12:29.665 "product_name": "Malloc disk", 00:12:29.665 "block_size": 512, 00:12:29.665 "num_blocks": 65536, 00:12:29.665 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:29.665 "assigned_rate_limits": { 00:12:29.665 "rw_ios_per_sec": 0, 00:12:29.665 "rw_mbytes_per_sec": 0, 00:12:29.665 "r_mbytes_per_sec": 0, 00:12:29.665 "w_mbytes_per_sec": 0 00:12:29.665 }, 00:12:29.665 "claimed": false, 00:12:29.665 "zoned": false, 00:12:29.665 
"supported_io_types": { 00:12:29.665 "read": true, 00:12:29.665 "write": true, 00:12:29.665 "unmap": true, 00:12:29.665 "flush": true, 00:12:29.665 "reset": true, 00:12:29.665 "nvme_admin": false, 00:12:29.665 "nvme_io": false, 00:12:29.665 "nvme_io_md": false, 00:12:29.665 "write_zeroes": true, 00:12:29.665 "zcopy": true, 00:12:29.665 "get_zone_info": false, 00:12:29.665 "zone_management": false, 00:12:29.665 "zone_append": false, 00:12:29.665 "compare": false, 00:12:29.665 "compare_and_write": false, 00:12:29.665 "abort": true, 00:12:29.665 "seek_hole": false, 00:12:29.665 "seek_data": false, 00:12:29.665 "copy": true, 00:12:29.665 "nvme_iov_md": false 00:12:29.665 }, 00:12:29.665 "memory_domains": [ 00:12:29.665 { 00:12:29.665 "dma_device_id": "system", 00:12:29.665 "dma_device_type": 1 00:12:29.665 }, 00:12:29.665 { 00:12:29.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.665 "dma_device_type": 2 00:12:29.665 } 00:12:29.665 ], 00:12:29.665 "driver_specific": {} 00:12:29.665 } 00:12:29.665 ] 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:29.665 [2024-07-12 11:53:19.825073] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:29.665 [2024-07-12 11:53:19.825101] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:29.665 [2024-07-12 11:53:19.825112] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:29.665 
[2024-07-12 11:53:19.826075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.665 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.924 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.924 "name": "Existed_Raid", 00:12:29.924 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.924 "strip_size_kb": 64, 00:12:29.924 "state": "configuring", 00:12:29.924 "raid_level": "concat", 00:12:29.924 "superblock": false, 00:12:29.924 "num_base_bdevs": 3, 00:12:29.924 
"num_base_bdevs_discovered": 2, 00:12:29.924 "num_base_bdevs_operational": 3, 00:12:29.924 "base_bdevs_list": [ 00:12:29.924 { 00:12:29.924 "name": "BaseBdev1", 00:12:29.924 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.924 "is_configured": false, 00:12:29.924 "data_offset": 0, 00:12:29.924 "data_size": 0 00:12:29.924 }, 00:12:29.924 { 00:12:29.924 "name": "BaseBdev2", 00:12:29.924 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:29.924 "is_configured": true, 00:12:29.924 "data_offset": 0, 00:12:29.924 "data_size": 65536 00:12:29.924 }, 00:12:29.924 { 00:12:29.924 "name": "BaseBdev3", 00:12:29.924 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:29.924 "is_configured": true, 00:12:29.924 "data_offset": 0, 00:12:29.924 "data_size": 65536 00:12:29.924 } 00:12:29.924 ] 00:12:29.924 }' 00:12:29.924 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.924 11:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:30.489 [2024-07-12 11:53:20.663230] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.489 11:53:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.489 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.748 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.748 "name": "Existed_Raid", 00:12:30.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.748 "strip_size_kb": 64, 00:12:30.748 "state": "configuring", 00:12:30.748 "raid_level": "concat", 00:12:30.748 "superblock": false, 00:12:30.748 "num_base_bdevs": 3, 00:12:30.748 "num_base_bdevs_discovered": 1, 00:12:30.748 "num_base_bdevs_operational": 3, 00:12:30.748 "base_bdevs_list": [ 00:12:30.748 { 00:12:30.748 "name": "BaseBdev1", 00:12:30.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.748 "is_configured": false, 00:12:30.748 "data_offset": 0, 00:12:30.748 "data_size": 0 00:12:30.748 }, 00:12:30.748 { 00:12:30.748 "name": null, 00:12:30.748 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:30.748 "is_configured": false, 00:12:30.748 "data_offset": 0, 00:12:30.748 "data_size": 65536 00:12:30.748 }, 00:12:30.748 { 00:12:30.748 "name": "BaseBdev3", 00:12:30.748 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:30.748 "is_configured": true, 00:12:30.748 "data_offset": 0, 
00:12:30.748 "data_size": 65536 00:12:30.748 } 00:12:30.748 ] 00:12:30.748 }' 00:12:30.748 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.748 11:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.315 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.315 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:31.315 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:31.315 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:31.573 [2024-07-12 11:53:21.692714] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:31.574 BaseBdev1 00:12:31.574 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:31.574 11:53:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:31.574 11:53:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:31.574 11:53:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:31.574 11:53:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:31.574 11:53:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:31.574 11:53:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:31.832 11:53:21 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:31.832 [ 00:12:31.832 { 00:12:31.832 "name": "BaseBdev1", 00:12:31.832 "aliases": [ 00:12:31.832 "9ffb205b-698a-4fc7-b069-fe2d7f907fcb" 00:12:31.832 ], 00:12:31.832 "product_name": "Malloc disk", 00:12:31.832 "block_size": 512, 00:12:31.832 "num_blocks": 65536, 00:12:31.832 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:31.832 "assigned_rate_limits": { 00:12:31.832 "rw_ios_per_sec": 0, 00:12:31.832 "rw_mbytes_per_sec": 0, 00:12:31.832 "r_mbytes_per_sec": 0, 00:12:31.832 "w_mbytes_per_sec": 0 00:12:31.832 }, 00:12:31.832 "claimed": true, 00:12:31.832 "claim_type": "exclusive_write", 00:12:31.832 "zoned": false, 00:12:31.832 "supported_io_types": { 00:12:31.832 "read": true, 00:12:31.832 "write": true, 00:12:31.832 "unmap": true, 00:12:31.832 "flush": true, 00:12:31.832 "reset": true, 00:12:31.832 "nvme_admin": false, 00:12:31.832 "nvme_io": false, 00:12:31.832 "nvme_io_md": false, 00:12:31.832 "write_zeroes": true, 00:12:31.832 "zcopy": true, 00:12:31.832 "get_zone_info": false, 00:12:31.832 "zone_management": false, 00:12:31.833 "zone_append": false, 00:12:31.833 "compare": false, 00:12:31.833 "compare_and_write": false, 00:12:31.833 "abort": true, 00:12:31.833 "seek_hole": false, 00:12:31.833 "seek_data": false, 00:12:31.833 "copy": true, 00:12:31.833 "nvme_iov_md": false 00:12:31.833 }, 00:12:31.833 "memory_domains": [ 00:12:31.833 { 00:12:31.833 "dma_device_id": "system", 00:12:31.833 "dma_device_type": 1 00:12:31.833 }, 00:12:31.833 { 00:12:31.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.833 "dma_device_type": 2 00:12:31.833 } 00:12:31.833 ], 00:12:31.833 "driver_specific": {} 00:12:31.833 } 00:12:31.833 ] 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:31.833 11:53:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.833 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.092 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.092 "name": "Existed_Raid", 00:12:32.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.092 "strip_size_kb": 64, 00:12:32.092 "state": "configuring", 00:12:32.092 "raid_level": "concat", 00:12:32.092 "superblock": false, 00:12:32.092 "num_base_bdevs": 3, 00:12:32.092 "num_base_bdevs_discovered": 2, 00:12:32.092 "num_base_bdevs_operational": 3, 00:12:32.092 "base_bdevs_list": [ 00:12:32.092 { 
00:12:32.092 "name": "BaseBdev1", 00:12:32.092 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:32.092 "is_configured": true, 00:12:32.092 "data_offset": 0, 00:12:32.092 "data_size": 65536 00:12:32.092 }, 00:12:32.092 { 00:12:32.092 "name": null, 00:12:32.092 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:32.092 "is_configured": false, 00:12:32.092 "data_offset": 0, 00:12:32.092 "data_size": 65536 00:12:32.092 }, 00:12:32.092 { 00:12:32.092 "name": "BaseBdev3", 00:12:32.092 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:32.092 "is_configured": true, 00:12:32.092 "data_offset": 0, 00:12:32.092 "data_size": 65536 00:12:32.092 } 00:12:32.092 ] 00:12:32.092 }' 00:12:32.092 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.092 11:53:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:32.659 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.659 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:32.659 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:32.659 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:32.924 [2024-07-12 11:53:23.016151] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.924 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.183 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.183 "name": "Existed_Raid", 00:12:33.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.183 "strip_size_kb": 64, 00:12:33.183 "state": "configuring", 00:12:33.183 "raid_level": "concat", 00:12:33.183 "superblock": false, 00:12:33.183 "num_base_bdevs": 3, 00:12:33.183 "num_base_bdevs_discovered": 1, 00:12:33.183 "num_base_bdevs_operational": 3, 00:12:33.183 "base_bdevs_list": [ 00:12:33.183 { 00:12:33.183 "name": "BaseBdev1", 00:12:33.183 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:33.183 "is_configured": true, 00:12:33.183 "data_offset": 0, 00:12:33.183 "data_size": 65536 00:12:33.183 }, 00:12:33.183 { 00:12:33.183 "name": null, 00:12:33.183 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:33.183 
"is_configured": false, 00:12:33.183 "data_offset": 0, 00:12:33.183 "data_size": 65536 00:12:33.183 }, 00:12:33.183 { 00:12:33.183 "name": null, 00:12:33.183 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:33.183 "is_configured": false, 00:12:33.183 "data_offset": 0, 00:12:33.183 "data_size": 65536 00:12:33.183 } 00:12:33.183 ] 00:12:33.183 }' 00:12:33.183 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.183 11:53:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.441 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:33.441 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.699 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:33.699 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:33.958 [2024-07-12 11:53:23.990749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:33.958 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:33.958 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.958 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:33.958 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:33.958 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.958 11:53:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:33.958 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.958 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.958 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.958 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.959 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.959 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.959 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.959 "name": "Existed_Raid", 00:12:33.959 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.959 "strip_size_kb": 64, 00:12:33.959 "state": "configuring", 00:12:33.959 "raid_level": "concat", 00:12:33.959 "superblock": false, 00:12:33.959 "num_base_bdevs": 3, 00:12:33.959 "num_base_bdevs_discovered": 2, 00:12:33.959 "num_base_bdevs_operational": 3, 00:12:33.959 "base_bdevs_list": [ 00:12:33.959 { 00:12:33.959 "name": "BaseBdev1", 00:12:33.959 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:33.959 "is_configured": true, 00:12:33.959 "data_offset": 0, 00:12:33.959 "data_size": 65536 00:12:33.959 }, 00:12:33.959 { 00:12:33.959 "name": null, 00:12:33.959 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:33.959 "is_configured": false, 00:12:33.959 "data_offset": 0, 00:12:33.959 "data_size": 65536 00:12:33.959 }, 00:12:33.959 { 00:12:33.959 "name": "BaseBdev3", 00:12:33.959 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:33.959 "is_configured": true, 00:12:33.959 "data_offset": 0, 
00:12:33.959 "data_size": 65536 00:12:33.959 } 00:12:33.959 ] 00:12:33.959 }' 00:12:33.959 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.959 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.528 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:34.528 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:34.787 [2024-07-12 11:53:24.965283] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:34.787 
11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.787 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:35.046 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.046 "name": "Existed_Raid", 00:12:35.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.046 "strip_size_kb": 64, 00:12:35.046 "state": "configuring", 00:12:35.046 "raid_level": "concat", 00:12:35.046 "superblock": false, 00:12:35.046 "num_base_bdevs": 3, 00:12:35.046 "num_base_bdevs_discovered": 1, 00:12:35.046 "num_base_bdevs_operational": 3, 00:12:35.046 "base_bdevs_list": [ 00:12:35.046 { 00:12:35.046 "name": null, 00:12:35.046 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:35.046 "is_configured": false, 00:12:35.046 "data_offset": 0, 00:12:35.046 "data_size": 65536 00:12:35.046 }, 00:12:35.046 { 00:12:35.046 "name": null, 00:12:35.046 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:35.046 "is_configured": false, 00:12:35.046 "data_offset": 0, 00:12:35.046 "data_size": 65536 00:12:35.046 }, 00:12:35.046 { 00:12:35.046 "name": "BaseBdev3", 00:12:35.046 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:35.046 "is_configured": true, 00:12:35.046 "data_offset": 0, 00:12:35.046 "data_size": 65536 00:12:35.046 } 00:12:35.046 ] 00:12:35.046 }' 00:12:35.046 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:35.046 11:53:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:35.616 11:53:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.616 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:35.616 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:35.616 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:35.875 [2024-07-12 11:53:25.937558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.875 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.134 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.134 "name": "Existed_Raid", 00:12:36.134 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.134 "strip_size_kb": 64, 00:12:36.134 "state": "configuring", 00:12:36.134 "raid_level": "concat", 00:12:36.134 "superblock": false, 00:12:36.134 "num_base_bdevs": 3, 00:12:36.134 "num_base_bdevs_discovered": 2, 00:12:36.134 "num_base_bdevs_operational": 3, 00:12:36.134 "base_bdevs_list": [ 00:12:36.134 { 00:12:36.134 "name": null, 00:12:36.134 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:36.134 "is_configured": false, 00:12:36.135 "data_offset": 0, 00:12:36.135 "data_size": 65536 00:12:36.135 }, 00:12:36.135 { 00:12:36.135 "name": "BaseBdev2", 00:12:36.135 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:36.135 "is_configured": true, 00:12:36.135 "data_offset": 0, 00:12:36.135 "data_size": 65536 00:12:36.135 }, 00:12:36.135 { 00:12:36.135 "name": "BaseBdev3", 00:12:36.135 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:36.135 "is_configured": true, 00:12:36.135 "data_offset": 0, 00:12:36.135 "data_size": 65536 00:12:36.135 } 00:12:36.135 ] 00:12:36.135 }' 00:12:36.135 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.135 11:53:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.394 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.394 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:36.653 
11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:36.653 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.653 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:36.912 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9ffb205b-698a-4fc7-b069-fe2d7f907fcb 00:12:36.912 [2024-07-12 11:53:27.103256] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:36.912 [2024-07-12 11:53:27.103283] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1af6d70 00:12:36.912 [2024-07-12 11:53:27.103287] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:36.912 [2024-07-12 11:53:27.103415] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x193ab50 00:12:36.912 [2024-07-12 11:53:27.103491] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1af6d70 00:12:36.912 [2024-07-12 11:53:27.103500] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1af6d70 00:12:36.912 [2024-07-12 11:53:27.103635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:36.912 NewBaseBdev 00:12:36.912 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:36.912 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:12:36.912 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:36.912 11:53:27 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:12:36.912 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:36.912 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:36.912 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:37.172 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:37.431 [ 00:12:37.431 { 00:12:37.431 "name": "NewBaseBdev", 00:12:37.431 "aliases": [ 00:12:37.431 "9ffb205b-698a-4fc7-b069-fe2d7f907fcb" 00:12:37.432 ], 00:12:37.432 "product_name": "Malloc disk", 00:12:37.432 "block_size": 512, 00:12:37.432 "num_blocks": 65536, 00:12:37.432 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:37.432 "assigned_rate_limits": { 00:12:37.432 "rw_ios_per_sec": 0, 00:12:37.432 "rw_mbytes_per_sec": 0, 00:12:37.432 "r_mbytes_per_sec": 0, 00:12:37.432 "w_mbytes_per_sec": 0 00:12:37.432 }, 00:12:37.432 "claimed": true, 00:12:37.432 "claim_type": "exclusive_write", 00:12:37.432 "zoned": false, 00:12:37.432 "supported_io_types": { 00:12:37.432 "read": true, 00:12:37.432 "write": true, 00:12:37.432 "unmap": true, 00:12:37.432 "flush": true, 00:12:37.432 "reset": true, 00:12:37.432 "nvme_admin": false, 00:12:37.432 "nvme_io": false, 00:12:37.432 "nvme_io_md": false, 00:12:37.432 "write_zeroes": true, 00:12:37.432 "zcopy": true, 00:12:37.432 "get_zone_info": false, 00:12:37.432 "zone_management": false, 00:12:37.432 "zone_append": false, 00:12:37.432 "compare": false, 00:12:37.432 "compare_and_write": false, 00:12:37.432 "abort": true, 00:12:37.432 "seek_hole": false, 00:12:37.432 "seek_data": false, 00:12:37.432 "copy": true, 00:12:37.432 "nvme_iov_md": 
false 00:12:37.432 }, 00:12:37.432 "memory_domains": [ 00:12:37.432 { 00:12:37.432 "dma_device_id": "system", 00:12:37.432 "dma_device_type": 1 00:12:37.432 }, 00:12:37.432 { 00:12:37.432 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.432 "dma_device_type": 2 00:12:37.432 } 00:12:37.432 ], 00:12:37.432 "driver_specific": {} 00:12:37.432 } 00:12:37.432 ] 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:37.432 11:53:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:37.432 "name": "Existed_Raid", 00:12:37.432 "uuid": "3d4e9182-f5aa-44dd-988d-407375be2195", 00:12:37.432 "strip_size_kb": 64, 00:12:37.432 "state": "online", 00:12:37.432 "raid_level": "concat", 00:12:37.432 "superblock": false, 00:12:37.432 "num_base_bdevs": 3, 00:12:37.432 "num_base_bdevs_discovered": 3, 00:12:37.432 "num_base_bdevs_operational": 3, 00:12:37.432 "base_bdevs_list": [ 00:12:37.432 { 00:12:37.432 "name": "NewBaseBdev", 00:12:37.432 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:37.432 "is_configured": true, 00:12:37.432 "data_offset": 0, 00:12:37.432 "data_size": 65536 00:12:37.432 }, 00:12:37.432 { 00:12:37.432 "name": "BaseBdev2", 00:12:37.432 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:37.432 "is_configured": true, 00:12:37.432 "data_offset": 0, 00:12:37.432 "data_size": 65536 00:12:37.432 }, 00:12:37.432 { 00:12:37.432 "name": "BaseBdev3", 00:12:37.432 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:37.432 "is_configured": true, 00:12:37.432 "data_offset": 0, 00:12:37.432 "data_size": 65536 00:12:37.432 } 00:12:37.432 ] 00:12:37.432 }' 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:37.432 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:38.000 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:38.000 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:38.000 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:38.000 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:38.000 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:38.000 11:53:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:38.000 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:38.000 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:38.259 [2024-07-12 11:53:28.266584] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:38.259 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:38.259 "name": "Existed_Raid", 00:12:38.259 "aliases": [ 00:12:38.259 "3d4e9182-f5aa-44dd-988d-407375be2195" 00:12:38.259 ], 00:12:38.259 "product_name": "Raid Volume", 00:12:38.259 "block_size": 512, 00:12:38.259 "num_blocks": 196608, 00:12:38.259 "uuid": "3d4e9182-f5aa-44dd-988d-407375be2195", 00:12:38.259 "assigned_rate_limits": { 00:12:38.259 "rw_ios_per_sec": 0, 00:12:38.259 "rw_mbytes_per_sec": 0, 00:12:38.259 "r_mbytes_per_sec": 0, 00:12:38.259 "w_mbytes_per_sec": 0 00:12:38.259 }, 00:12:38.259 "claimed": false, 00:12:38.259 "zoned": false, 00:12:38.259 "supported_io_types": { 00:12:38.259 "read": true, 00:12:38.259 "write": true, 00:12:38.259 "unmap": true, 00:12:38.259 "flush": true, 00:12:38.259 "reset": true, 00:12:38.259 "nvme_admin": false, 00:12:38.259 "nvme_io": false, 00:12:38.259 "nvme_io_md": false, 00:12:38.259 "write_zeroes": true, 00:12:38.259 "zcopy": false, 00:12:38.259 "get_zone_info": false, 00:12:38.259 "zone_management": false, 00:12:38.259 "zone_append": false, 00:12:38.259 "compare": false, 00:12:38.259 "compare_and_write": false, 00:12:38.259 "abort": false, 00:12:38.259 "seek_hole": false, 00:12:38.259 "seek_data": false, 00:12:38.259 "copy": false, 00:12:38.259 "nvme_iov_md": false 00:12:38.259 }, 00:12:38.259 "memory_domains": [ 00:12:38.259 { 00:12:38.259 "dma_device_id": "system", 00:12:38.259 "dma_device_type": 1 00:12:38.259 }, 
00:12:38.259 { 00:12:38.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.259 "dma_device_type": 2 00:12:38.259 }, 00:12:38.259 { 00:12:38.259 "dma_device_id": "system", 00:12:38.259 "dma_device_type": 1 00:12:38.259 }, 00:12:38.259 { 00:12:38.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.259 "dma_device_type": 2 00:12:38.259 }, 00:12:38.259 { 00:12:38.259 "dma_device_id": "system", 00:12:38.259 "dma_device_type": 1 00:12:38.259 }, 00:12:38.259 { 00:12:38.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.259 "dma_device_type": 2 00:12:38.259 } 00:12:38.259 ], 00:12:38.259 "driver_specific": { 00:12:38.259 "raid": { 00:12:38.259 "uuid": "3d4e9182-f5aa-44dd-988d-407375be2195", 00:12:38.259 "strip_size_kb": 64, 00:12:38.259 "state": "online", 00:12:38.259 "raid_level": "concat", 00:12:38.259 "superblock": false, 00:12:38.259 "num_base_bdevs": 3, 00:12:38.259 "num_base_bdevs_discovered": 3, 00:12:38.259 "num_base_bdevs_operational": 3, 00:12:38.259 "base_bdevs_list": [ 00:12:38.259 { 00:12:38.259 "name": "NewBaseBdev", 00:12:38.260 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:38.260 "is_configured": true, 00:12:38.260 "data_offset": 0, 00:12:38.260 "data_size": 65536 00:12:38.260 }, 00:12:38.260 { 00:12:38.260 "name": "BaseBdev2", 00:12:38.260 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:38.260 "is_configured": true, 00:12:38.260 "data_offset": 0, 00:12:38.260 "data_size": 65536 00:12:38.260 }, 00:12:38.260 { 00:12:38.260 "name": "BaseBdev3", 00:12:38.260 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:38.260 "is_configured": true, 00:12:38.260 "data_offset": 0, 00:12:38.260 "data_size": 65536 00:12:38.260 } 00:12:38.260 ] 00:12:38.260 } 00:12:38.260 } 00:12:38.260 }' 00:12:38.260 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:38.260 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:12:38.260 BaseBdev2 00:12:38.260 BaseBdev3' 00:12:38.260 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:38.260 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:38.260 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:38.260 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:38.260 "name": "NewBaseBdev", 00:12:38.260 "aliases": [ 00:12:38.260 "9ffb205b-698a-4fc7-b069-fe2d7f907fcb" 00:12:38.260 ], 00:12:38.260 "product_name": "Malloc disk", 00:12:38.260 "block_size": 512, 00:12:38.260 "num_blocks": 65536, 00:12:38.260 "uuid": "9ffb205b-698a-4fc7-b069-fe2d7f907fcb", 00:12:38.260 "assigned_rate_limits": { 00:12:38.260 "rw_ios_per_sec": 0, 00:12:38.260 "rw_mbytes_per_sec": 0, 00:12:38.260 "r_mbytes_per_sec": 0, 00:12:38.260 "w_mbytes_per_sec": 0 00:12:38.260 }, 00:12:38.260 "claimed": true, 00:12:38.260 "claim_type": "exclusive_write", 00:12:38.260 "zoned": false, 00:12:38.260 "supported_io_types": { 00:12:38.260 "read": true, 00:12:38.260 "write": true, 00:12:38.260 "unmap": true, 00:12:38.260 "flush": true, 00:12:38.260 "reset": true, 00:12:38.260 "nvme_admin": false, 00:12:38.260 "nvme_io": false, 00:12:38.260 "nvme_io_md": false, 00:12:38.260 "write_zeroes": true, 00:12:38.260 "zcopy": true, 00:12:38.260 "get_zone_info": false, 00:12:38.260 "zone_management": false, 00:12:38.260 "zone_append": false, 00:12:38.260 "compare": false, 00:12:38.260 "compare_and_write": false, 00:12:38.260 "abort": true, 00:12:38.260 "seek_hole": false, 00:12:38.260 "seek_data": false, 00:12:38.260 "copy": true, 00:12:38.260 "nvme_iov_md": false 00:12:38.260 }, 00:12:38.260 "memory_domains": [ 00:12:38.260 { 00:12:38.260 "dma_device_id": "system", 00:12:38.260 
"dma_device_type": 1 00:12:38.260 }, 00:12:38.260 { 00:12:38.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.260 "dma_device_type": 2 00:12:38.260 } 00:12:38.260 ], 00:12:38.260 "driver_specific": {} 00:12:38.260 }' 00:12:38.260 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.519 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.519 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:38.519 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.519 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.519 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:38.519 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.519 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.519 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:38.519 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.779 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.779 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.779 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:38.779 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:38.779 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:38.779 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:38.779 "name": 
"BaseBdev2", 00:12:38.779 "aliases": [ 00:12:38.779 "4d965c15-e28c-4bd1-9441-67e18ec54683" 00:12:38.779 ], 00:12:38.779 "product_name": "Malloc disk", 00:12:38.779 "block_size": 512, 00:12:38.779 "num_blocks": 65536, 00:12:38.779 "uuid": "4d965c15-e28c-4bd1-9441-67e18ec54683", 00:12:38.779 "assigned_rate_limits": { 00:12:38.779 "rw_ios_per_sec": 0, 00:12:38.779 "rw_mbytes_per_sec": 0, 00:12:38.779 "r_mbytes_per_sec": 0, 00:12:38.779 "w_mbytes_per_sec": 0 00:12:38.779 }, 00:12:38.779 "claimed": true, 00:12:38.779 "claim_type": "exclusive_write", 00:12:38.779 "zoned": false, 00:12:38.779 "supported_io_types": { 00:12:38.779 "read": true, 00:12:38.779 "write": true, 00:12:38.779 "unmap": true, 00:12:38.779 "flush": true, 00:12:38.779 "reset": true, 00:12:38.779 "nvme_admin": false, 00:12:38.779 "nvme_io": false, 00:12:38.779 "nvme_io_md": false, 00:12:38.779 "write_zeroes": true, 00:12:38.779 "zcopy": true, 00:12:38.779 "get_zone_info": false, 00:12:38.779 "zone_management": false, 00:12:38.779 "zone_append": false, 00:12:38.779 "compare": false, 00:12:38.779 "compare_and_write": false, 00:12:38.779 "abort": true, 00:12:38.779 "seek_hole": false, 00:12:38.779 "seek_data": false, 00:12:38.779 "copy": true, 00:12:38.779 "nvme_iov_md": false 00:12:38.779 }, 00:12:38.779 "memory_domains": [ 00:12:38.779 { 00:12:38.779 "dma_device_id": "system", 00:12:38.779 "dma_device_type": 1 00:12:38.779 }, 00:12:38.779 { 00:12:38.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.779 "dma_device_type": 2 00:12:38.779 } 00:12:38.779 ], 00:12:38.779 "driver_specific": {} 00:12:38.779 }' 00:12:38.779 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:39.052 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:39.336 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:39.336 "name": "BaseBdev3", 00:12:39.336 "aliases": [ 00:12:39.336 "801b5e31-139d-49a2-96a1-7920422bcf25" 00:12:39.336 ], 00:12:39.336 "product_name": "Malloc disk", 00:12:39.336 "block_size": 512, 00:12:39.336 "num_blocks": 65536, 00:12:39.336 "uuid": "801b5e31-139d-49a2-96a1-7920422bcf25", 00:12:39.336 "assigned_rate_limits": { 00:12:39.336 "rw_ios_per_sec": 0, 00:12:39.336 "rw_mbytes_per_sec": 0, 00:12:39.336 "r_mbytes_per_sec": 0, 00:12:39.336 "w_mbytes_per_sec": 0 00:12:39.336 }, 00:12:39.336 "claimed": true, 00:12:39.336 "claim_type": "exclusive_write", 00:12:39.336 "zoned": false, 00:12:39.336 "supported_io_types": { 
00:12:39.336 "read": true, 00:12:39.336 "write": true, 00:12:39.336 "unmap": true, 00:12:39.336 "flush": true, 00:12:39.336 "reset": true, 00:12:39.336 "nvme_admin": false, 00:12:39.336 "nvme_io": false, 00:12:39.336 "nvme_io_md": false, 00:12:39.336 "write_zeroes": true, 00:12:39.336 "zcopy": true, 00:12:39.336 "get_zone_info": false, 00:12:39.336 "zone_management": false, 00:12:39.336 "zone_append": false, 00:12:39.336 "compare": false, 00:12:39.336 "compare_and_write": false, 00:12:39.336 "abort": true, 00:12:39.336 "seek_hole": false, 00:12:39.336 "seek_data": false, 00:12:39.336 "copy": true, 00:12:39.336 "nvme_iov_md": false 00:12:39.336 }, 00:12:39.336 "memory_domains": [ 00:12:39.336 { 00:12:39.336 "dma_device_id": "system", 00:12:39.336 "dma_device_type": 1 00:12:39.336 }, 00:12:39.336 { 00:12:39.336 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.336 "dma_device_type": 2 00:12:39.336 } 00:12:39.336 ], 00:12:39.336 "driver_specific": {} 00:12:39.336 }' 00:12:39.336 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.336 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.336 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:39.336 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.336 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.336 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:39.336 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.336 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:39.620 [2024-07-12 11:53:29.822509] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:39.620 [2024-07-12 11:53:29.822531] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:39.620 [2024-07-12 11:53:29.822567] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:39.620 [2024-07-12 11:53:29.822600] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:39.620 [2024-07-12 11:53:29.822606] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1af6d70 name Existed_Raid, state offline 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 622924 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 622924 ']' 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 622924 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:39.620 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 622924 00:12:39.896 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:39.896 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:12:39.896 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 622924' 00:12:39.896 killing process with pid 622924 00:12:39.896 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 622924 00:12:39.896 [2024-07-12 11:53:29.877646] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:39.896 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 622924 00:12:39.896 [2024-07-12 11:53:29.901430] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:39.896 11:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:39.896 00:12:39.896 real 0m21.217s 00:12:39.896 user 0m39.602s 00:12:39.896 sys 0m3.258s 00:12:39.896 11:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:39.896 11:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.896 ************************************ 00:12:39.896 END TEST raid_state_function_test 00:12:39.896 ************************************ 00:12:39.896 11:53:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:39.896 11:53:30 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:12:39.896 11:53:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:39.896 11:53:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:39.896 11:53:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:40.179 ************************************ 00:12:40.179 START TEST raid_state_function_test_sb 00:12:40.179 ************************************ 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=627167 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 627167' 00:12:40.179 Process raid pid: 627167 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 627167 /var/tmp/spdk-raid.sock 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 627167 ']' 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:12:40.179 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:40.179 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:40.179 [2024-07-12 11:53:30.210059] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:12:40.180 [2024-07-12 11:53:30.210102] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:40.180 [2024-07-12 11:53:30.274957] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:40.180 [2024-07-12 11:53:30.345691] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:40.180 [2024-07-12 11:53:30.395111] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:40.180 [2024-07-12 11:53:30.395133] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:40.763 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:40.763 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:40.763 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:41.020 [2024-07-12 11:53:31.150118] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:41.021 [2024-07-12 11:53:31.150150] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:41.021 [2024-07-12 11:53:31.150156] bdev.c:8157:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: BaseBdev2 00:12:41.021 [2024-07-12 11:53:31.150162] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:41.021 [2024-07-12 11:53:31.150166] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:41.021 [2024-07-12 11:53:31.150171] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.021 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:41.278 11:53:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.279 "name": "Existed_Raid", 00:12:41.279 "uuid": "25cee72c-4d30-4677-b800-0ebe94547f9b", 00:12:41.279 "strip_size_kb": 64, 00:12:41.279 "state": "configuring", 00:12:41.279 "raid_level": "concat", 00:12:41.279 "superblock": true, 00:12:41.279 "num_base_bdevs": 3, 00:12:41.279 "num_base_bdevs_discovered": 0, 00:12:41.279 "num_base_bdevs_operational": 3, 00:12:41.279 "base_bdevs_list": [ 00:12:41.279 { 00:12:41.279 "name": "BaseBdev1", 00:12:41.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.279 "is_configured": false, 00:12:41.279 "data_offset": 0, 00:12:41.279 "data_size": 0 00:12:41.279 }, 00:12:41.279 { 00:12:41.279 "name": "BaseBdev2", 00:12:41.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.279 "is_configured": false, 00:12:41.279 "data_offset": 0, 00:12:41.279 "data_size": 0 00:12:41.279 }, 00:12:41.279 { 00:12:41.279 "name": "BaseBdev3", 00:12:41.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.279 "is_configured": false, 00:12:41.279 "data_offset": 0, 00:12:41.279 "data_size": 0 00:12:41.279 } 00:12:41.279 ] 00:12:41.279 }' 00:12:41.279 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.279 11:53:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:41.847 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:41.847 [2024-07-12 11:53:31.976151] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:41.847 [2024-07-12 11:53:31.976174] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21cc1d0 name Existed_Raid, state configuring 00:12:41.847 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:42.106 [2024-07-12 11:53:32.144608] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:42.106 [2024-07-12 11:53:32.144626] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:42.106 [2024-07-12 11:53:32.144631] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:42.106 [2024-07-12 11:53:32.144636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:42.106 [2024-07-12 11:53:32.144640] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:42.106 [2024-07-12 11:53:32.144645] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:42.106 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:42.106 [2024-07-12 11:53:32.321342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:42.106 BaseBdev1 00:12:42.106 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:42.106 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:42.106 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:42.106 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:42.106 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:42.106 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:12:42.106 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:42.366 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:42.626 [ 00:12:42.626 { 00:12:42.626 "name": "BaseBdev1", 00:12:42.626 "aliases": [ 00:12:42.626 "a7c61df9-decc-47a6-a74a-235508a41c5a" 00:12:42.626 ], 00:12:42.626 "product_name": "Malloc disk", 00:12:42.626 "block_size": 512, 00:12:42.626 "num_blocks": 65536, 00:12:42.626 "uuid": "a7c61df9-decc-47a6-a74a-235508a41c5a", 00:12:42.626 "assigned_rate_limits": { 00:12:42.626 "rw_ios_per_sec": 0, 00:12:42.626 "rw_mbytes_per_sec": 0, 00:12:42.626 "r_mbytes_per_sec": 0, 00:12:42.626 "w_mbytes_per_sec": 0 00:12:42.626 }, 00:12:42.626 "claimed": true, 00:12:42.626 "claim_type": "exclusive_write", 00:12:42.626 "zoned": false, 00:12:42.626 "supported_io_types": { 00:12:42.626 "read": true, 00:12:42.626 "write": true, 00:12:42.626 "unmap": true, 00:12:42.626 "flush": true, 00:12:42.626 "reset": true, 00:12:42.626 "nvme_admin": false, 00:12:42.626 "nvme_io": false, 00:12:42.626 "nvme_io_md": false, 00:12:42.626 "write_zeroes": true, 00:12:42.626 "zcopy": true, 00:12:42.626 "get_zone_info": false, 00:12:42.626 "zone_management": false, 00:12:42.626 "zone_append": false, 00:12:42.626 "compare": false, 00:12:42.626 "compare_and_write": false, 00:12:42.626 "abort": true, 00:12:42.626 "seek_hole": false, 00:12:42.626 "seek_data": false, 00:12:42.626 "copy": true, 00:12:42.626 "nvme_iov_md": false 00:12:42.626 }, 00:12:42.626 "memory_domains": [ 00:12:42.626 { 00:12:42.626 "dma_device_id": "system", 00:12:42.626 "dma_device_type": 1 00:12:42.626 }, 00:12:42.626 { 00:12:42.626 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.626 
"dma_device_type": 2 00:12:42.626 } 00:12:42.626 ], 00:12:42.626 "driver_specific": {} 00:12:42.626 } 00:12:42.626 ] 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.626 "name": "Existed_Raid", 00:12:42.626 "uuid": "ca64ecdd-a8ba-4d6e-a815-1fb07eacd22b", 00:12:42.626 "strip_size_kb": 64, 
00:12:42.626 "state": "configuring", 00:12:42.626 "raid_level": "concat", 00:12:42.626 "superblock": true, 00:12:42.626 "num_base_bdevs": 3, 00:12:42.626 "num_base_bdevs_discovered": 1, 00:12:42.626 "num_base_bdevs_operational": 3, 00:12:42.626 "base_bdevs_list": [ 00:12:42.626 { 00:12:42.626 "name": "BaseBdev1", 00:12:42.626 "uuid": "a7c61df9-decc-47a6-a74a-235508a41c5a", 00:12:42.626 "is_configured": true, 00:12:42.626 "data_offset": 2048, 00:12:42.626 "data_size": 63488 00:12:42.626 }, 00:12:42.626 { 00:12:42.626 "name": "BaseBdev2", 00:12:42.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.626 "is_configured": false, 00:12:42.626 "data_offset": 0, 00:12:42.626 "data_size": 0 00:12:42.626 }, 00:12:42.626 { 00:12:42.626 "name": "BaseBdev3", 00:12:42.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.626 "is_configured": false, 00:12:42.626 "data_offset": 0, 00:12:42.626 "data_size": 0 00:12:42.626 } 00:12:42.626 ] 00:12:42.626 }' 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.626 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:43.193 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:43.452 [2024-07-12 11:53:33.476314] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:43.452 [2024-07-12 11:53:33.476345] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21cbaa0 name Existed_Raid, state configuring 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:43.452 [2024-07-12 11:53:33.644788] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:43.452 [2024-07-12 11:53:33.645835] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:43.452 [2024-07-12 11:53:33.645862] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:43.452 [2024-07-12 11:53:33.645867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:43.452 [2024-07-12 11:53:33.645872] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.452 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.711 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:43.711 "name": "Existed_Raid", 00:12:43.711 "uuid": "f4c5e330-4061-404c-aee2-bdc691084ee0", 00:12:43.711 "strip_size_kb": 64, 00:12:43.711 "state": "configuring", 00:12:43.711 "raid_level": "concat", 00:12:43.711 "superblock": true, 00:12:43.711 "num_base_bdevs": 3, 00:12:43.711 "num_base_bdevs_discovered": 1, 00:12:43.711 "num_base_bdevs_operational": 3, 00:12:43.711 "base_bdevs_list": [ 00:12:43.711 { 00:12:43.711 "name": "BaseBdev1", 00:12:43.711 "uuid": "a7c61df9-decc-47a6-a74a-235508a41c5a", 00:12:43.711 "is_configured": true, 00:12:43.711 "data_offset": 2048, 00:12:43.711 "data_size": 63488 00:12:43.711 }, 00:12:43.711 { 00:12:43.711 "name": "BaseBdev2", 00:12:43.711 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:43.711 "is_configured": false, 00:12:43.711 "data_offset": 0, 00:12:43.711 "data_size": 0 00:12:43.711 }, 00:12:43.711 { 00:12:43.711 "name": "BaseBdev3", 00:12:43.711 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:43.711 "is_configured": false, 00:12:43.711 "data_offset": 0, 00:12:43.711 "data_size": 0 00:12:43.711 } 00:12:43.711 ] 00:12:43.711 }' 00:12:43.711 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:43.711 11:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:44.278 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:12:44.278 [2024-07-12 11:53:34.461484] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:44.278 BaseBdev2 00:12:44.278 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:44.278 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:44.278 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:44.278 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:44.278 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:44.278 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:44.278 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:44.537 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:44.796 [ 00:12:44.796 { 00:12:44.796 "name": "BaseBdev2", 00:12:44.796 "aliases": [ 00:12:44.796 "f47e0069-5dd4-449e-bc6e-c93b3655f2a7" 00:12:44.796 ], 00:12:44.796 "product_name": "Malloc disk", 00:12:44.796 "block_size": 512, 00:12:44.796 "num_blocks": 65536, 00:12:44.796 "uuid": "f47e0069-5dd4-449e-bc6e-c93b3655f2a7", 00:12:44.796 "assigned_rate_limits": { 00:12:44.796 "rw_ios_per_sec": 0, 00:12:44.796 "rw_mbytes_per_sec": 0, 00:12:44.796 "r_mbytes_per_sec": 0, 00:12:44.796 "w_mbytes_per_sec": 0 00:12:44.796 }, 00:12:44.796 "claimed": true, 00:12:44.796 "claim_type": "exclusive_write", 00:12:44.796 "zoned": false, 00:12:44.796 "supported_io_types": { 00:12:44.796 "read": true, 00:12:44.796 "write": true, 
00:12:44.796 "unmap": true, 00:12:44.796 "flush": true, 00:12:44.796 "reset": true, 00:12:44.796 "nvme_admin": false, 00:12:44.796 "nvme_io": false, 00:12:44.796 "nvme_io_md": false, 00:12:44.796 "write_zeroes": true, 00:12:44.796 "zcopy": true, 00:12:44.796 "get_zone_info": false, 00:12:44.796 "zone_management": false, 00:12:44.796 "zone_append": false, 00:12:44.796 "compare": false, 00:12:44.796 "compare_and_write": false, 00:12:44.796 "abort": true, 00:12:44.796 "seek_hole": false, 00:12:44.796 "seek_data": false, 00:12:44.796 "copy": true, 00:12:44.796 "nvme_iov_md": false 00:12:44.796 }, 00:12:44.796 "memory_domains": [ 00:12:44.796 { 00:12:44.796 "dma_device_id": "system", 00:12:44.796 "dma_device_type": 1 00:12:44.796 }, 00:12:44.796 { 00:12:44.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.796 "dma_device_type": 2 00:12:44.796 } 00:12:44.796 ], 00:12:44.796 "driver_specific": {} 00:12:44.796 } 00:12:44.796 ] 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.796 "name": "Existed_Raid", 00:12:44.796 "uuid": "f4c5e330-4061-404c-aee2-bdc691084ee0", 00:12:44.796 "strip_size_kb": 64, 00:12:44.796 "state": "configuring", 00:12:44.796 "raid_level": "concat", 00:12:44.796 "superblock": true, 00:12:44.796 "num_base_bdevs": 3, 00:12:44.796 "num_base_bdevs_discovered": 2, 00:12:44.796 "num_base_bdevs_operational": 3, 00:12:44.796 "base_bdevs_list": [ 00:12:44.796 { 00:12:44.796 "name": "BaseBdev1", 00:12:44.796 "uuid": "a7c61df9-decc-47a6-a74a-235508a41c5a", 00:12:44.796 "is_configured": true, 00:12:44.796 "data_offset": 2048, 00:12:44.796 "data_size": 63488 00:12:44.796 }, 00:12:44.796 { 00:12:44.796 "name": "BaseBdev2", 00:12:44.796 "uuid": "f47e0069-5dd4-449e-bc6e-c93b3655f2a7", 00:12:44.796 "is_configured": true, 00:12:44.796 "data_offset": 2048, 00:12:44.796 "data_size": 63488 00:12:44.796 }, 00:12:44.796 { 00:12:44.796 "name": "BaseBdev3", 00:12:44.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:44.796 "is_configured": false, 00:12:44.796 "data_offset": 0, 00:12:44.796 "data_size": 0 00:12:44.796 } 
00:12:44.796 ] 00:12:44.796 }' 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.796 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:45.363 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:45.622 [2024-07-12 11:53:35.631206] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:45.622 [2024-07-12 11:53:35.631324] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21cc990 00:12:45.622 [2024-07-12 11:53:35.631332] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:45.622 [2024-07-12 11:53:35.631453] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ed2210 00:12:45.622 [2024-07-12 11:53:35.631540] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21cc990 00:12:45.622 [2024-07-12 11:53:35.631546] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21cc990 00:12:45.622 [2024-07-12 11:53:35.631615] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:45.622 BaseBdev3 00:12:45.622 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:45.622 11:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:45.622 11:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:45.622 11:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:45.622 11:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:45.622 11:53:35 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:45.622 11:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:45.622 11:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:45.881 [ 00:12:45.881 { 00:12:45.881 "name": "BaseBdev3", 00:12:45.881 "aliases": [ 00:12:45.881 "95be254d-8ea0-42a7-9c74-1013d707c9f3" 00:12:45.881 ], 00:12:45.881 "product_name": "Malloc disk", 00:12:45.881 "block_size": 512, 00:12:45.881 "num_blocks": 65536, 00:12:45.881 "uuid": "95be254d-8ea0-42a7-9c74-1013d707c9f3", 00:12:45.881 "assigned_rate_limits": { 00:12:45.881 "rw_ios_per_sec": 0, 00:12:45.881 "rw_mbytes_per_sec": 0, 00:12:45.881 "r_mbytes_per_sec": 0, 00:12:45.881 "w_mbytes_per_sec": 0 00:12:45.881 }, 00:12:45.881 "claimed": true, 00:12:45.881 "claim_type": "exclusive_write", 00:12:45.881 "zoned": false, 00:12:45.881 "supported_io_types": { 00:12:45.881 "read": true, 00:12:45.881 "write": true, 00:12:45.881 "unmap": true, 00:12:45.881 "flush": true, 00:12:45.881 "reset": true, 00:12:45.881 "nvme_admin": false, 00:12:45.881 "nvme_io": false, 00:12:45.881 "nvme_io_md": false, 00:12:45.881 "write_zeroes": true, 00:12:45.881 "zcopy": true, 00:12:45.881 "get_zone_info": false, 00:12:45.881 "zone_management": false, 00:12:45.881 "zone_append": false, 00:12:45.881 "compare": false, 00:12:45.881 "compare_and_write": false, 00:12:45.881 "abort": true, 00:12:45.881 "seek_hole": false, 00:12:45.881 "seek_data": false, 00:12:45.881 "copy": true, 00:12:45.881 "nvme_iov_md": false 00:12:45.881 }, 00:12:45.881 "memory_domains": [ 00:12:45.881 { 00:12:45.881 "dma_device_id": "system", 00:12:45.881 "dma_device_type": 1 00:12:45.881 }, 00:12:45.881 { 00:12:45.881 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:45.881 "dma_device_type": 2 00:12:45.881 } 00:12:45.881 ], 00:12:45.881 "driver_specific": {} 00:12:45.881 } 00:12:45.881 ] 00:12:45.881 11:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:45.881 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:45.881 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:45.881 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:45.882 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:46.140 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.141 "name": "Existed_Raid", 00:12:46.141 "uuid": "f4c5e330-4061-404c-aee2-bdc691084ee0", 00:12:46.141 "strip_size_kb": 64, 00:12:46.141 "state": "online", 00:12:46.141 "raid_level": "concat", 00:12:46.141 "superblock": true, 00:12:46.141 "num_base_bdevs": 3, 00:12:46.141 "num_base_bdevs_discovered": 3, 00:12:46.141 "num_base_bdevs_operational": 3, 00:12:46.141 "base_bdevs_list": [ 00:12:46.141 { 00:12:46.141 "name": "BaseBdev1", 00:12:46.141 "uuid": "a7c61df9-decc-47a6-a74a-235508a41c5a", 00:12:46.141 "is_configured": true, 00:12:46.141 "data_offset": 2048, 00:12:46.141 "data_size": 63488 00:12:46.141 }, 00:12:46.141 { 00:12:46.141 "name": "BaseBdev2", 00:12:46.141 "uuid": "f47e0069-5dd4-449e-bc6e-c93b3655f2a7", 00:12:46.141 "is_configured": true, 00:12:46.141 "data_offset": 2048, 00:12:46.141 "data_size": 63488 00:12:46.141 }, 00:12:46.141 { 00:12:46.141 "name": "BaseBdev3", 00:12:46.141 "uuid": "95be254d-8ea0-42a7-9c74-1013d707c9f3", 00:12:46.141 "is_configured": true, 00:12:46.141 "data_offset": 2048, 00:12:46.141 "data_size": 63488 00:12:46.141 } 00:12:46.141 ] 00:12:46.141 }' 00:12:46.141 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.141 11:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:46.399 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:46.399 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:46.399 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:46.399 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:46.399 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:12:46.399 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:46.399 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:46.399 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:46.659 [2024-07-12 11:53:36.790395] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:46.659 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:46.659 "name": "Existed_Raid", 00:12:46.659 "aliases": [ 00:12:46.659 "f4c5e330-4061-404c-aee2-bdc691084ee0" 00:12:46.659 ], 00:12:46.659 "product_name": "Raid Volume", 00:12:46.659 "block_size": 512, 00:12:46.659 "num_blocks": 190464, 00:12:46.659 "uuid": "f4c5e330-4061-404c-aee2-bdc691084ee0", 00:12:46.659 "assigned_rate_limits": { 00:12:46.659 "rw_ios_per_sec": 0, 00:12:46.659 "rw_mbytes_per_sec": 0, 00:12:46.659 "r_mbytes_per_sec": 0, 00:12:46.659 "w_mbytes_per_sec": 0 00:12:46.659 }, 00:12:46.659 "claimed": false, 00:12:46.659 "zoned": false, 00:12:46.659 "supported_io_types": { 00:12:46.659 "read": true, 00:12:46.659 "write": true, 00:12:46.659 "unmap": true, 00:12:46.659 "flush": true, 00:12:46.659 "reset": true, 00:12:46.659 "nvme_admin": false, 00:12:46.659 "nvme_io": false, 00:12:46.659 "nvme_io_md": false, 00:12:46.659 "write_zeroes": true, 00:12:46.659 "zcopy": false, 00:12:46.659 "get_zone_info": false, 00:12:46.659 "zone_management": false, 00:12:46.659 "zone_append": false, 00:12:46.659 "compare": false, 00:12:46.659 "compare_and_write": false, 00:12:46.659 "abort": false, 00:12:46.659 "seek_hole": false, 00:12:46.659 "seek_data": false, 00:12:46.659 "copy": false, 00:12:46.659 "nvme_iov_md": false 00:12:46.659 }, 00:12:46.659 "memory_domains": [ 00:12:46.659 { 00:12:46.659 "dma_device_id": "system", 
00:12:46.659 "dma_device_type": 1 00:12:46.659 }, 00:12:46.659 { 00:12:46.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.659 "dma_device_type": 2 00:12:46.659 }, 00:12:46.659 { 00:12:46.659 "dma_device_id": "system", 00:12:46.659 "dma_device_type": 1 00:12:46.659 }, 00:12:46.659 { 00:12:46.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.659 "dma_device_type": 2 00:12:46.659 }, 00:12:46.659 { 00:12:46.659 "dma_device_id": "system", 00:12:46.659 "dma_device_type": 1 00:12:46.659 }, 00:12:46.659 { 00:12:46.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.659 "dma_device_type": 2 00:12:46.659 } 00:12:46.659 ], 00:12:46.659 "driver_specific": { 00:12:46.659 "raid": { 00:12:46.659 "uuid": "f4c5e330-4061-404c-aee2-bdc691084ee0", 00:12:46.659 "strip_size_kb": 64, 00:12:46.659 "state": "online", 00:12:46.659 "raid_level": "concat", 00:12:46.659 "superblock": true, 00:12:46.659 "num_base_bdevs": 3, 00:12:46.659 "num_base_bdevs_discovered": 3, 00:12:46.659 "num_base_bdevs_operational": 3, 00:12:46.659 "base_bdevs_list": [ 00:12:46.659 { 00:12:46.659 "name": "BaseBdev1", 00:12:46.659 "uuid": "a7c61df9-decc-47a6-a74a-235508a41c5a", 00:12:46.659 "is_configured": true, 00:12:46.659 "data_offset": 2048, 00:12:46.659 "data_size": 63488 00:12:46.659 }, 00:12:46.659 { 00:12:46.659 "name": "BaseBdev2", 00:12:46.659 "uuid": "f47e0069-5dd4-449e-bc6e-c93b3655f2a7", 00:12:46.659 "is_configured": true, 00:12:46.659 "data_offset": 2048, 00:12:46.659 "data_size": 63488 00:12:46.659 }, 00:12:46.659 { 00:12:46.659 "name": "BaseBdev3", 00:12:46.659 "uuid": "95be254d-8ea0-42a7-9c74-1013d707c9f3", 00:12:46.659 "is_configured": true, 00:12:46.659 "data_offset": 2048, 00:12:46.659 "data_size": 63488 00:12:46.659 } 00:12:46.659 ] 00:12:46.659 } 00:12:46.659 } 00:12:46.659 }' 00:12:46.659 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:46.659 11:53:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:46.659 BaseBdev2 00:12:46.659 BaseBdev3' 00:12:46.659 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:46.659 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:46.659 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:46.918 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:46.918 "name": "BaseBdev1", 00:12:46.918 "aliases": [ 00:12:46.918 "a7c61df9-decc-47a6-a74a-235508a41c5a" 00:12:46.918 ], 00:12:46.918 "product_name": "Malloc disk", 00:12:46.918 "block_size": 512, 00:12:46.918 "num_blocks": 65536, 00:12:46.918 "uuid": "a7c61df9-decc-47a6-a74a-235508a41c5a", 00:12:46.918 "assigned_rate_limits": { 00:12:46.918 "rw_ios_per_sec": 0, 00:12:46.918 "rw_mbytes_per_sec": 0, 00:12:46.918 "r_mbytes_per_sec": 0, 00:12:46.918 "w_mbytes_per_sec": 0 00:12:46.918 }, 00:12:46.918 "claimed": true, 00:12:46.918 "claim_type": "exclusive_write", 00:12:46.918 "zoned": false, 00:12:46.918 "supported_io_types": { 00:12:46.918 "read": true, 00:12:46.918 "write": true, 00:12:46.918 "unmap": true, 00:12:46.918 "flush": true, 00:12:46.918 "reset": true, 00:12:46.918 "nvme_admin": false, 00:12:46.918 "nvme_io": false, 00:12:46.918 "nvme_io_md": false, 00:12:46.918 "write_zeroes": true, 00:12:46.918 "zcopy": true, 00:12:46.918 "get_zone_info": false, 00:12:46.918 "zone_management": false, 00:12:46.918 "zone_append": false, 00:12:46.918 "compare": false, 00:12:46.918 "compare_and_write": false, 00:12:46.918 "abort": true, 00:12:46.918 "seek_hole": false, 00:12:46.918 "seek_data": false, 00:12:46.918 "copy": true, 00:12:46.918 "nvme_iov_md": false 00:12:46.918 }, 00:12:46.918 "memory_domains": 
[ 00:12:46.918 { 00:12:46.918 "dma_device_id": "system", 00:12:46.918 "dma_device_type": 1 00:12:46.918 }, 00:12:46.918 { 00:12:46.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.918 "dma_device_type": 2 00:12:46.918 } 00:12:46.918 ], 00:12:46.918 "driver_specific": {} 00:12:46.918 }' 00:12:46.918 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:46.918 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:46.918 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:46.918 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:46.918 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:46.918 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:47.177 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.177 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.177 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:47.177 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.177 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.178 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:47.178 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:47.178 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:47.178 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 
00:12:47.437 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:47.437 "name": "BaseBdev2", 00:12:47.437 "aliases": [ 00:12:47.437 "f47e0069-5dd4-449e-bc6e-c93b3655f2a7" 00:12:47.437 ], 00:12:47.437 "product_name": "Malloc disk", 00:12:47.437 "block_size": 512, 00:12:47.437 "num_blocks": 65536, 00:12:47.437 "uuid": "f47e0069-5dd4-449e-bc6e-c93b3655f2a7", 00:12:47.437 "assigned_rate_limits": { 00:12:47.437 "rw_ios_per_sec": 0, 00:12:47.437 "rw_mbytes_per_sec": 0, 00:12:47.437 "r_mbytes_per_sec": 0, 00:12:47.437 "w_mbytes_per_sec": 0 00:12:47.437 }, 00:12:47.437 "claimed": true, 00:12:47.437 "claim_type": "exclusive_write", 00:12:47.437 "zoned": false, 00:12:47.437 "supported_io_types": { 00:12:47.437 "read": true, 00:12:47.437 "write": true, 00:12:47.437 "unmap": true, 00:12:47.437 "flush": true, 00:12:47.437 "reset": true, 00:12:47.437 "nvme_admin": false, 00:12:47.437 "nvme_io": false, 00:12:47.437 "nvme_io_md": false, 00:12:47.437 "write_zeroes": true, 00:12:47.437 "zcopy": true, 00:12:47.437 "get_zone_info": false, 00:12:47.437 "zone_management": false, 00:12:47.437 "zone_append": false, 00:12:47.437 "compare": false, 00:12:47.437 "compare_and_write": false, 00:12:47.437 "abort": true, 00:12:47.437 "seek_hole": false, 00:12:47.437 "seek_data": false, 00:12:47.437 "copy": true, 00:12:47.437 "nvme_iov_md": false 00:12:47.437 }, 00:12:47.437 "memory_domains": [ 00:12:47.437 { 00:12:47.437 "dma_device_id": "system", 00:12:47.437 "dma_device_type": 1 00:12:47.437 }, 00:12:47.437 { 00:12:47.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.437 "dma_device_type": 2 00:12:47.437 } 00:12:47.437 ], 00:12:47.437 "driver_specific": {} 00:12:47.437 }' 00:12:47.437 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.437 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.437 11:53:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:47.437 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.437 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.437 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:47.437 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.696 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.696 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:47.696 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.696 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.696 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:47.696 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:47.696 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:47.696 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:47.956 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:47.956 "name": "BaseBdev3", 00:12:47.956 "aliases": [ 00:12:47.956 "95be254d-8ea0-42a7-9c74-1013d707c9f3" 00:12:47.956 ], 00:12:47.956 "product_name": "Malloc disk", 00:12:47.956 "block_size": 512, 00:12:47.956 "num_blocks": 65536, 00:12:47.956 "uuid": "95be254d-8ea0-42a7-9c74-1013d707c9f3", 00:12:47.956 "assigned_rate_limits": { 00:12:47.956 "rw_ios_per_sec": 0, 00:12:47.956 "rw_mbytes_per_sec": 0, 00:12:47.956 "r_mbytes_per_sec": 0, 00:12:47.956 
"w_mbytes_per_sec": 0 00:12:47.956 }, 00:12:47.956 "claimed": true, 00:12:47.956 "claim_type": "exclusive_write", 00:12:47.956 "zoned": false, 00:12:47.956 "supported_io_types": { 00:12:47.956 "read": true, 00:12:47.956 "write": true, 00:12:47.956 "unmap": true, 00:12:47.956 "flush": true, 00:12:47.956 "reset": true, 00:12:47.956 "nvme_admin": false, 00:12:47.956 "nvme_io": false, 00:12:47.956 "nvme_io_md": false, 00:12:47.956 "write_zeroes": true, 00:12:47.956 "zcopy": true, 00:12:47.956 "get_zone_info": false, 00:12:47.956 "zone_management": false, 00:12:47.956 "zone_append": false, 00:12:47.956 "compare": false, 00:12:47.956 "compare_and_write": false, 00:12:47.956 "abort": true, 00:12:47.956 "seek_hole": false, 00:12:47.956 "seek_data": false, 00:12:47.956 "copy": true, 00:12:47.956 "nvme_iov_md": false 00:12:47.956 }, 00:12:47.956 "memory_domains": [ 00:12:47.956 { 00:12:47.956 "dma_device_id": "system", 00:12:47.956 "dma_device_type": 1 00:12:47.956 }, 00:12:47.956 { 00:12:47.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.956 "dma_device_type": 2 00:12:47.956 } 00:12:47.956 ], 00:12:47.956 "driver_specific": {} 00:12:47.956 }' 00:12:47.956 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.956 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.956 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:47.956 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.956 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.956 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:47.956 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.956 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:48.214 [2024-07-12 11:53:38.438483] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:48.214 [2024-07-12 11:53:38.438502] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:48.214 [2024-07-12 11:53:38.438537] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:48.214 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.215 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:48.473 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.473 "name": "Existed_Raid", 00:12:48.473 "uuid": "f4c5e330-4061-404c-aee2-bdc691084ee0", 00:12:48.473 "strip_size_kb": 64, 00:12:48.473 "state": "offline", 00:12:48.473 "raid_level": "concat", 00:12:48.473 "superblock": true, 00:12:48.473 "num_base_bdevs": 3, 00:12:48.473 "num_base_bdevs_discovered": 2, 00:12:48.473 "num_base_bdevs_operational": 2, 00:12:48.473 "base_bdevs_list": [ 00:12:48.473 { 00:12:48.473 "name": null, 00:12:48.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.473 "is_configured": false, 00:12:48.473 "data_offset": 2048, 00:12:48.473 "data_size": 63488 00:12:48.473 }, 00:12:48.473 { 00:12:48.473 "name": "BaseBdev2", 00:12:48.473 "uuid": "f47e0069-5dd4-449e-bc6e-c93b3655f2a7", 00:12:48.473 "is_configured": true, 00:12:48.473 "data_offset": 2048, 00:12:48.473 "data_size": 
63488 00:12:48.473 }, 00:12:48.473 { 00:12:48.473 "name": "BaseBdev3", 00:12:48.473 "uuid": "95be254d-8ea0-42a7-9c74-1013d707c9f3", 00:12:48.473 "is_configured": true, 00:12:48.473 "data_offset": 2048, 00:12:48.473 "data_size": 63488 00:12:48.473 } 00:12:48.473 ] 00:12:48.473 }' 00:12:48.473 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.473 11:53:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:49.040 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:49.040 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:49.040 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.040 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:49.040 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:49.040 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:49.040 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:49.300 [2024-07-12 11:53:39.417779] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:49.300 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:49.300 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:49.300 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:49.300 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:49.559 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:49.559 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:49.559 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:49.559 [2024-07-12 11:53:39.760430] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:49.559 [2024-07-12 11:53:39.760461] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21cc990 name Existed_Raid, state offline 00:12:49.559 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:49.559 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:49.559 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.559 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:49.817 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:49.817 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:49.817 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:49.817 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:49.817 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:49.817 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:50.075 BaseBdev2 00:12:50.075 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:50.076 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:50.076 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:50.076 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:50.076 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:50.076 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:50.076 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:50.076 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:50.335 [ 00:12:50.335 { 00:12:50.335 "name": "BaseBdev2", 00:12:50.335 "aliases": [ 00:12:50.335 "c2ed0579-13d8-43cc-9b9a-be5d76bc7979" 00:12:50.335 ], 00:12:50.335 "product_name": "Malloc disk", 00:12:50.335 "block_size": 512, 00:12:50.335 "num_blocks": 65536, 00:12:50.335 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:50.335 "assigned_rate_limits": { 00:12:50.335 "rw_ios_per_sec": 0, 00:12:50.335 "rw_mbytes_per_sec": 0, 00:12:50.335 "r_mbytes_per_sec": 0, 00:12:50.335 "w_mbytes_per_sec": 0 00:12:50.335 }, 00:12:50.335 "claimed": false, 00:12:50.335 "zoned": false, 00:12:50.335 "supported_io_types": { 00:12:50.335 "read": true, 00:12:50.335 "write": true, 00:12:50.335 "unmap": true, 00:12:50.335 "flush": 
true, 00:12:50.335 "reset": true, 00:12:50.335 "nvme_admin": false, 00:12:50.335 "nvme_io": false, 00:12:50.335 "nvme_io_md": false, 00:12:50.335 "write_zeroes": true, 00:12:50.335 "zcopy": true, 00:12:50.335 "get_zone_info": false, 00:12:50.335 "zone_management": false, 00:12:50.335 "zone_append": false, 00:12:50.335 "compare": false, 00:12:50.335 "compare_and_write": false, 00:12:50.335 "abort": true, 00:12:50.335 "seek_hole": false, 00:12:50.335 "seek_data": false, 00:12:50.335 "copy": true, 00:12:50.335 "nvme_iov_md": false 00:12:50.335 }, 00:12:50.335 "memory_domains": [ 00:12:50.335 { 00:12:50.335 "dma_device_id": "system", 00:12:50.335 "dma_device_type": 1 00:12:50.335 }, 00:12:50.335 { 00:12:50.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.335 "dma_device_type": 2 00:12:50.335 } 00:12:50.335 ], 00:12:50.335 "driver_specific": {} 00:12:50.335 } 00:12:50.335 ] 00:12:50.335 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:50.335 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:50.335 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:50.335 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:50.335 BaseBdev3 00:12:50.594 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:50.594 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:50.594 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:50.594 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:50.594 11:53:40 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:50.594 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:50.594 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:50.594 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:50.854 [ 00:12:50.854 { 00:12:50.854 "name": "BaseBdev3", 00:12:50.854 "aliases": [ 00:12:50.854 "9b6515fa-0d07-4abd-a140-2b52d12231dc" 00:12:50.854 ], 00:12:50.854 "product_name": "Malloc disk", 00:12:50.854 "block_size": 512, 00:12:50.854 "num_blocks": 65536, 00:12:50.854 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:50.854 "assigned_rate_limits": { 00:12:50.854 "rw_ios_per_sec": 0, 00:12:50.854 "rw_mbytes_per_sec": 0, 00:12:50.854 "r_mbytes_per_sec": 0, 00:12:50.854 "w_mbytes_per_sec": 0 00:12:50.854 }, 00:12:50.854 "claimed": false, 00:12:50.854 "zoned": false, 00:12:50.854 "supported_io_types": { 00:12:50.854 "read": true, 00:12:50.854 "write": true, 00:12:50.854 "unmap": true, 00:12:50.854 "flush": true, 00:12:50.854 "reset": true, 00:12:50.854 "nvme_admin": false, 00:12:50.854 "nvme_io": false, 00:12:50.854 "nvme_io_md": false, 00:12:50.854 "write_zeroes": true, 00:12:50.854 "zcopy": true, 00:12:50.854 "get_zone_info": false, 00:12:50.854 "zone_management": false, 00:12:50.854 "zone_append": false, 00:12:50.854 "compare": false, 00:12:50.854 "compare_and_write": false, 00:12:50.854 "abort": true, 00:12:50.854 "seek_hole": false, 00:12:50.854 "seek_data": false, 00:12:50.854 "copy": true, 00:12:50.854 "nvme_iov_md": false 00:12:50.854 }, 00:12:50.854 "memory_domains": [ 00:12:50.854 { 00:12:50.854 "dma_device_id": "system", 00:12:50.854 "dma_device_type": 1 
00:12:50.854 }, 00:12:50.854 { 00:12:50.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.854 "dma_device_type": 2 00:12:50.854 } 00:12:50.854 ], 00:12:50.854 "driver_specific": {} 00:12:50.854 } 00:12:50.854 ] 00:12:50.854 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:50.854 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:50.854 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:50.854 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:50.854 [2024-07-12 11:53:41.073218] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:50.854 [2024-07-12 11:53:41.073248] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:50.854 [2024-07-12 11:53:41.073260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:50.854 [2024-07-12 11:53:41.074229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.854 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.114 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.114 "name": "Existed_Raid", 00:12:51.114 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:51.114 "strip_size_kb": 64, 00:12:51.114 "state": "configuring", 00:12:51.114 "raid_level": "concat", 00:12:51.114 "superblock": true, 00:12:51.114 "num_base_bdevs": 3, 00:12:51.114 "num_base_bdevs_discovered": 2, 00:12:51.114 "num_base_bdevs_operational": 3, 00:12:51.114 "base_bdevs_list": [ 00:12:51.114 { 00:12:51.114 "name": "BaseBdev1", 00:12:51.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.114 "is_configured": false, 00:12:51.114 "data_offset": 0, 00:12:51.114 "data_size": 0 00:12:51.114 }, 00:12:51.114 { 00:12:51.114 "name": "BaseBdev2", 00:12:51.114 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:51.114 "is_configured": true, 00:12:51.114 "data_offset": 2048, 00:12:51.114 "data_size": 63488 00:12:51.114 }, 00:12:51.114 { 00:12:51.114 "name": "BaseBdev3", 00:12:51.114 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:51.114 "is_configured": true, 00:12:51.114 "data_offset": 2048, 00:12:51.114 
"data_size": 63488 00:12:51.114 } 00:12:51.114 ] 00:12:51.114 }' 00:12:51.114 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.114 11:53:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:51.681 [2024-07-12 11:53:41.879278] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:51.681 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.940 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.940 "name": "Existed_Raid", 00:12:51.940 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:51.940 "strip_size_kb": 64, 00:12:51.940 "state": "configuring", 00:12:51.940 "raid_level": "concat", 00:12:51.940 "superblock": true, 00:12:51.940 "num_base_bdevs": 3, 00:12:51.940 "num_base_bdevs_discovered": 1, 00:12:51.940 "num_base_bdevs_operational": 3, 00:12:51.940 "base_bdevs_list": [ 00:12:51.940 { 00:12:51.940 "name": "BaseBdev1", 00:12:51.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.940 "is_configured": false, 00:12:51.940 "data_offset": 0, 00:12:51.940 "data_size": 0 00:12:51.940 }, 00:12:51.940 { 00:12:51.940 "name": null, 00:12:51.940 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:51.940 "is_configured": false, 00:12:51.940 "data_offset": 2048, 00:12:51.940 "data_size": 63488 00:12:51.940 }, 00:12:51.940 { 00:12:51.940 "name": "BaseBdev3", 00:12:51.940 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:51.940 "is_configured": true, 00:12:51.940 "data_offset": 2048, 00:12:51.940 "data_size": 63488 00:12:51.940 } 00:12:51.940 ] 00:12:51.940 }' 00:12:51.940 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.940 11:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:52.509 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.509 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:52.509 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:12:52.509 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:52.767 [2024-07-12 11:53:42.884534] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:52.767 BaseBdev1 00:12:52.767 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:52.767 11:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:52.767 11:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:52.767 11:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:52.767 11:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:52.767 11:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:52.767 11:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:53.024 [ 00:12:53.024 { 00:12:53.024 "name": "BaseBdev1", 00:12:53.024 "aliases": [ 00:12:53.024 "3cb89b08-63b2-4ae9-b689-5d697875db66" 00:12:53.024 ], 00:12:53.024 "product_name": "Malloc disk", 00:12:53.024 "block_size": 512, 00:12:53.024 "num_blocks": 65536, 00:12:53.024 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:53.024 "assigned_rate_limits": { 00:12:53.024 "rw_ios_per_sec": 0, 00:12:53.024 "rw_mbytes_per_sec": 0, 00:12:53.024 "r_mbytes_per_sec": 0, 00:12:53.024 
"w_mbytes_per_sec": 0 00:12:53.024 }, 00:12:53.024 "claimed": true, 00:12:53.024 "claim_type": "exclusive_write", 00:12:53.024 "zoned": false, 00:12:53.024 "supported_io_types": { 00:12:53.024 "read": true, 00:12:53.024 "write": true, 00:12:53.024 "unmap": true, 00:12:53.024 "flush": true, 00:12:53.024 "reset": true, 00:12:53.024 "nvme_admin": false, 00:12:53.024 "nvme_io": false, 00:12:53.024 "nvme_io_md": false, 00:12:53.024 "write_zeroes": true, 00:12:53.024 "zcopy": true, 00:12:53.024 "get_zone_info": false, 00:12:53.024 "zone_management": false, 00:12:53.024 "zone_append": false, 00:12:53.024 "compare": false, 00:12:53.024 "compare_and_write": false, 00:12:53.024 "abort": true, 00:12:53.024 "seek_hole": false, 00:12:53.024 "seek_data": false, 00:12:53.024 "copy": true, 00:12:53.024 "nvme_iov_md": false 00:12:53.024 }, 00:12:53.024 "memory_domains": [ 00:12:53.024 { 00:12:53.024 "dma_device_id": "system", 00:12:53.024 "dma_device_type": 1 00:12:53.024 }, 00:12:53.024 { 00:12:53.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.024 "dma_device_type": 2 00:12:53.024 } 00:12:53.024 ], 00:12:53.024 "driver_specific": {} 00:12:53.024 } 00:12:53.024 ] 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.024 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:53.282 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.282 "name": "Existed_Raid", 00:12:53.282 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:53.282 "strip_size_kb": 64, 00:12:53.282 "state": "configuring", 00:12:53.282 "raid_level": "concat", 00:12:53.282 "superblock": true, 00:12:53.282 "num_base_bdevs": 3, 00:12:53.282 "num_base_bdevs_discovered": 2, 00:12:53.282 "num_base_bdevs_operational": 3, 00:12:53.282 "base_bdevs_list": [ 00:12:53.282 { 00:12:53.282 "name": "BaseBdev1", 00:12:53.282 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:53.282 "is_configured": true, 00:12:53.282 "data_offset": 2048, 00:12:53.282 "data_size": 63488 00:12:53.282 }, 00:12:53.282 { 00:12:53.282 "name": null, 00:12:53.282 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:53.282 "is_configured": false, 00:12:53.282 "data_offset": 2048, 00:12:53.282 "data_size": 63488 00:12:53.282 }, 00:12:53.282 { 00:12:53.282 "name": "BaseBdev3", 00:12:53.282 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:53.282 "is_configured": true, 00:12:53.282 "data_offset": 2048, 00:12:53.282 "data_size": 63488 00:12:53.282 } 
00:12:53.282 ] 00:12:53.282 }' 00:12:53.282 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.282 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:53.847 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.847 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:53.847 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:53.847 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:54.106 [2024-07-12 11:53:44.200146] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.106 
11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.106 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:54.364 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.364 "name": "Existed_Raid", 00:12:54.364 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:54.364 "strip_size_kb": 64, 00:12:54.364 "state": "configuring", 00:12:54.364 "raid_level": "concat", 00:12:54.364 "superblock": true, 00:12:54.364 "num_base_bdevs": 3, 00:12:54.364 "num_base_bdevs_discovered": 1, 00:12:54.364 "num_base_bdevs_operational": 3, 00:12:54.364 "base_bdevs_list": [ 00:12:54.364 { 00:12:54.364 "name": "BaseBdev1", 00:12:54.364 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:54.364 "is_configured": true, 00:12:54.364 "data_offset": 2048, 00:12:54.364 "data_size": 63488 00:12:54.364 }, 00:12:54.364 { 00:12:54.364 "name": null, 00:12:54.364 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:54.364 "is_configured": false, 00:12:54.364 "data_offset": 2048, 00:12:54.364 "data_size": 63488 00:12:54.364 }, 00:12:54.364 { 00:12:54.364 "name": null, 00:12:54.364 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:54.364 "is_configured": false, 00:12:54.364 "data_offset": 2048, 00:12:54.364 "data_size": 63488 00:12:54.364 } 00:12:54.364 ] 00:12:54.364 }' 00:12:54.364 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.364 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:54.622 11:53:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.622 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:54.880 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:54.880 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:55.139 [2024-07-12 11:53:45.158646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:55.139 11:53:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:55.139 "name": "Existed_Raid", 00:12:55.139 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:55.139 "strip_size_kb": 64, 00:12:55.139 "state": "configuring", 00:12:55.139 "raid_level": "concat", 00:12:55.139 "superblock": true, 00:12:55.139 "num_base_bdevs": 3, 00:12:55.139 "num_base_bdevs_discovered": 2, 00:12:55.139 "num_base_bdevs_operational": 3, 00:12:55.139 "base_bdevs_list": [ 00:12:55.139 { 00:12:55.139 "name": "BaseBdev1", 00:12:55.139 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:55.139 "is_configured": true, 00:12:55.139 "data_offset": 2048, 00:12:55.139 "data_size": 63488 00:12:55.139 }, 00:12:55.139 { 00:12:55.139 "name": null, 00:12:55.139 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:55.139 "is_configured": false, 00:12:55.139 "data_offset": 2048, 00:12:55.139 "data_size": 63488 00:12:55.139 }, 00:12:55.139 { 00:12:55.139 "name": "BaseBdev3", 00:12:55.139 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:55.139 "is_configured": true, 00:12:55.139 "data_offset": 2048, 00:12:55.139 "data_size": 63488 00:12:55.139 } 00:12:55.139 ] 00:12:55.139 }' 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:55.139 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:55.706 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:55.706 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.965 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:55.965 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:55.965 [2024-07-12 11:53:46.129339] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.965 11:53:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:56.225 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.225 "name": "Existed_Raid", 00:12:56.225 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:56.225 "strip_size_kb": 64, 00:12:56.225 "state": "configuring", 00:12:56.225 "raid_level": "concat", 00:12:56.225 "superblock": true, 00:12:56.225 "num_base_bdevs": 3, 00:12:56.225 "num_base_bdevs_discovered": 1, 00:12:56.225 "num_base_bdevs_operational": 3, 00:12:56.225 "base_bdevs_list": [ 00:12:56.225 { 00:12:56.225 "name": null, 00:12:56.225 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:56.225 "is_configured": false, 00:12:56.225 "data_offset": 2048, 00:12:56.225 "data_size": 63488 00:12:56.225 }, 00:12:56.225 { 00:12:56.225 "name": null, 00:12:56.225 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:56.225 "is_configured": false, 00:12:56.225 "data_offset": 2048, 00:12:56.225 "data_size": 63488 00:12:56.225 }, 00:12:56.225 { 00:12:56.225 "name": "BaseBdev3", 00:12:56.225 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:56.225 "is_configured": true, 00:12:56.225 "data_offset": 2048, 00:12:56.225 "data_size": 63488 00:12:56.225 } 00:12:56.225 ] 00:12:56.225 }' 00:12:56.225 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.225 11:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:56.793 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.793 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:56.793 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:56.793 11:53:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:57.052 [2024-07-12 11:53:47.161578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.052 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.311 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.311 "name": 
"Existed_Raid", 00:12:57.311 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:57.311 "strip_size_kb": 64, 00:12:57.311 "state": "configuring", 00:12:57.311 "raid_level": "concat", 00:12:57.311 "superblock": true, 00:12:57.311 "num_base_bdevs": 3, 00:12:57.311 "num_base_bdevs_discovered": 2, 00:12:57.311 "num_base_bdevs_operational": 3, 00:12:57.311 "base_bdevs_list": [ 00:12:57.311 { 00:12:57.311 "name": null, 00:12:57.311 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:57.311 "is_configured": false, 00:12:57.311 "data_offset": 2048, 00:12:57.311 "data_size": 63488 00:12:57.311 }, 00:12:57.311 { 00:12:57.311 "name": "BaseBdev2", 00:12:57.311 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:57.311 "is_configured": true, 00:12:57.311 "data_offset": 2048, 00:12:57.311 "data_size": 63488 00:12:57.311 }, 00:12:57.311 { 00:12:57.311 "name": "BaseBdev3", 00:12:57.311 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:57.311 "is_configured": true, 00:12:57.311 "data_offset": 2048, 00:12:57.311 "data_size": 63488 00:12:57.311 } 00:12:57.311 ] 00:12:57.311 }' 00:12:57.311 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.311 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:57.878 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.878 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:57.878 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:57.878 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.878 11:53:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:58.136 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3cb89b08-63b2-4ae9-b689-5d697875db66 00:12:58.136 [2024-07-12 11:53:48.355446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:58.136 [2024-07-12 11:53:48.355565] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x237fd70 00:12:58.136 [2024-07-12 11:53:48.355574] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:58.136 [2024-07-12 11:53:48.355706] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x237e2d0 00:12:58.136 [2024-07-12 11:53:48.355788] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x237fd70 00:12:58.136 [2024-07-12 11:53:48.355793] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x237fd70 00:12:58.136 [2024-07-12 11:53:48.355856] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:58.136 NewBaseBdev 00:12:58.136 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:58.136 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:12:58.136 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:58.136 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:58.136 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:58.136 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:58.136 11:53:48 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:58.395 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:58.654 [ 00:12:58.654 { 00:12:58.654 "name": "NewBaseBdev", 00:12:58.654 "aliases": [ 00:12:58.654 "3cb89b08-63b2-4ae9-b689-5d697875db66" 00:12:58.654 ], 00:12:58.654 "product_name": "Malloc disk", 00:12:58.654 "block_size": 512, 00:12:58.654 "num_blocks": 65536, 00:12:58.654 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:58.654 "assigned_rate_limits": { 00:12:58.654 "rw_ios_per_sec": 0, 00:12:58.654 "rw_mbytes_per_sec": 0, 00:12:58.654 "r_mbytes_per_sec": 0, 00:12:58.654 "w_mbytes_per_sec": 0 00:12:58.654 }, 00:12:58.654 "claimed": true, 00:12:58.654 "claim_type": "exclusive_write", 00:12:58.654 "zoned": false, 00:12:58.654 "supported_io_types": { 00:12:58.654 "read": true, 00:12:58.654 "write": true, 00:12:58.654 "unmap": true, 00:12:58.654 "flush": true, 00:12:58.654 "reset": true, 00:12:58.654 "nvme_admin": false, 00:12:58.654 "nvme_io": false, 00:12:58.654 "nvme_io_md": false, 00:12:58.654 "write_zeroes": true, 00:12:58.654 "zcopy": true, 00:12:58.654 "get_zone_info": false, 00:12:58.654 "zone_management": false, 00:12:58.654 "zone_append": false, 00:12:58.654 "compare": false, 00:12:58.654 "compare_and_write": false, 00:12:58.654 "abort": true, 00:12:58.654 "seek_hole": false, 00:12:58.654 "seek_data": false, 00:12:58.654 "copy": true, 00:12:58.654 "nvme_iov_md": false 00:12:58.654 }, 00:12:58.654 "memory_domains": [ 00:12:58.654 { 00:12:58.654 "dma_device_id": "system", 00:12:58.654 "dma_device_type": 1 00:12:58.654 }, 00:12:58.654 { 00:12:58.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.654 "dma_device_type": 2 00:12:58.654 } 
00:12:58.654 ], 00:12:58.654 "driver_specific": {} 00:12:58.654 } 00:12:58.654 ] 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:58.654 "name": "Existed_Raid", 00:12:58.654 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:58.654 "strip_size_kb": 64, 00:12:58.654 "state": "online", 00:12:58.654 
"raid_level": "concat", 00:12:58.654 "superblock": true, 00:12:58.654 "num_base_bdevs": 3, 00:12:58.654 "num_base_bdevs_discovered": 3, 00:12:58.654 "num_base_bdevs_operational": 3, 00:12:58.654 "base_bdevs_list": [ 00:12:58.654 { 00:12:58.654 "name": "NewBaseBdev", 00:12:58.654 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:58.654 "is_configured": true, 00:12:58.654 "data_offset": 2048, 00:12:58.654 "data_size": 63488 00:12:58.654 }, 00:12:58.654 { 00:12:58.654 "name": "BaseBdev2", 00:12:58.654 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:58.654 "is_configured": true, 00:12:58.654 "data_offset": 2048, 00:12:58.654 "data_size": 63488 00:12:58.654 }, 00:12:58.654 { 00:12:58.654 "name": "BaseBdev3", 00:12:58.654 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:58.654 "is_configured": true, 00:12:58.654 "data_offset": 2048, 00:12:58.654 "data_size": 63488 00:12:58.654 } 00:12:58.654 ] 00:12:58.654 }' 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:58.654 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:59.220 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:59.220 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:59.220 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:59.220 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:59.220 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:59.220 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:59.221 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:59.221 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:59.479 [2024-07-12 11:53:49.510657] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:59.479 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:59.479 "name": "Existed_Raid", 00:12:59.479 "aliases": [ 00:12:59.479 "604f91c1-6199-427d-8e24-4d7d59bf64c3" 00:12:59.479 ], 00:12:59.479 "product_name": "Raid Volume", 00:12:59.479 "block_size": 512, 00:12:59.479 "num_blocks": 190464, 00:12:59.479 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:59.479 "assigned_rate_limits": { 00:12:59.479 "rw_ios_per_sec": 0, 00:12:59.479 "rw_mbytes_per_sec": 0, 00:12:59.479 "r_mbytes_per_sec": 0, 00:12:59.479 "w_mbytes_per_sec": 0 00:12:59.479 }, 00:12:59.479 "claimed": false, 00:12:59.479 "zoned": false, 00:12:59.479 "supported_io_types": { 00:12:59.479 "read": true, 00:12:59.479 "write": true, 00:12:59.479 "unmap": true, 00:12:59.479 "flush": true, 00:12:59.479 "reset": true, 00:12:59.479 "nvme_admin": false, 00:12:59.479 "nvme_io": false, 00:12:59.479 "nvme_io_md": false, 00:12:59.479 "write_zeroes": true, 00:12:59.479 "zcopy": false, 00:12:59.479 "get_zone_info": false, 00:12:59.479 "zone_management": false, 00:12:59.479 "zone_append": false, 00:12:59.479 "compare": false, 00:12:59.479 "compare_and_write": false, 00:12:59.479 "abort": false, 00:12:59.479 "seek_hole": false, 00:12:59.479 "seek_data": false, 00:12:59.479 "copy": false, 00:12:59.479 "nvme_iov_md": false 00:12:59.479 }, 00:12:59.479 "memory_domains": [ 00:12:59.479 { 00:12:59.479 "dma_device_id": "system", 00:12:59.479 "dma_device_type": 1 00:12:59.479 }, 00:12:59.479 { 00:12:59.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.479 "dma_device_type": 2 00:12:59.479 }, 00:12:59.479 { 00:12:59.479 "dma_device_id": "system", 00:12:59.479 "dma_device_type": 1 00:12:59.479 
}, 00:12:59.479 { 00:12:59.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.479 "dma_device_type": 2 00:12:59.479 }, 00:12:59.479 { 00:12:59.479 "dma_device_id": "system", 00:12:59.479 "dma_device_type": 1 00:12:59.479 }, 00:12:59.479 { 00:12:59.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.479 "dma_device_type": 2 00:12:59.479 } 00:12:59.479 ], 00:12:59.479 "driver_specific": { 00:12:59.479 "raid": { 00:12:59.479 "uuid": "604f91c1-6199-427d-8e24-4d7d59bf64c3", 00:12:59.479 "strip_size_kb": 64, 00:12:59.479 "state": "online", 00:12:59.479 "raid_level": "concat", 00:12:59.479 "superblock": true, 00:12:59.479 "num_base_bdevs": 3, 00:12:59.479 "num_base_bdevs_discovered": 3, 00:12:59.479 "num_base_bdevs_operational": 3, 00:12:59.479 "base_bdevs_list": [ 00:12:59.479 { 00:12:59.479 "name": "NewBaseBdev", 00:12:59.479 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:59.479 "is_configured": true, 00:12:59.479 "data_offset": 2048, 00:12:59.479 "data_size": 63488 00:12:59.479 }, 00:12:59.479 { 00:12:59.479 "name": "BaseBdev2", 00:12:59.479 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:59.479 "is_configured": true, 00:12:59.479 "data_offset": 2048, 00:12:59.479 "data_size": 63488 00:12:59.479 }, 00:12:59.479 { 00:12:59.479 "name": "BaseBdev3", 00:12:59.479 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:12:59.479 "is_configured": true, 00:12:59.479 "data_offset": 2048, 00:12:59.479 "data_size": 63488 00:12:59.479 } 00:12:59.479 ] 00:12:59.479 } 00:12:59.479 } 00:12:59.479 }' 00:12:59.479 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:59.479 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:59.479 BaseBdev2 00:12:59.479 BaseBdev3' 00:12:59.479 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:59.479 
11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:59.479 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:59.738 "name": "NewBaseBdev", 00:12:59.738 "aliases": [ 00:12:59.738 "3cb89b08-63b2-4ae9-b689-5d697875db66" 00:12:59.738 ], 00:12:59.738 "product_name": "Malloc disk", 00:12:59.738 "block_size": 512, 00:12:59.738 "num_blocks": 65536, 00:12:59.738 "uuid": "3cb89b08-63b2-4ae9-b689-5d697875db66", 00:12:59.738 "assigned_rate_limits": { 00:12:59.738 "rw_ios_per_sec": 0, 00:12:59.738 "rw_mbytes_per_sec": 0, 00:12:59.738 "r_mbytes_per_sec": 0, 00:12:59.738 "w_mbytes_per_sec": 0 00:12:59.738 }, 00:12:59.738 "claimed": true, 00:12:59.738 "claim_type": "exclusive_write", 00:12:59.738 "zoned": false, 00:12:59.738 "supported_io_types": { 00:12:59.738 "read": true, 00:12:59.738 "write": true, 00:12:59.738 "unmap": true, 00:12:59.738 "flush": true, 00:12:59.738 "reset": true, 00:12:59.738 "nvme_admin": false, 00:12:59.738 "nvme_io": false, 00:12:59.738 "nvme_io_md": false, 00:12:59.738 "write_zeroes": true, 00:12:59.738 "zcopy": true, 00:12:59.738 "get_zone_info": false, 00:12:59.738 "zone_management": false, 00:12:59.738 "zone_append": false, 00:12:59.738 "compare": false, 00:12:59.738 "compare_and_write": false, 00:12:59.738 "abort": true, 00:12:59.738 "seek_hole": false, 00:12:59.738 "seek_data": false, 00:12:59.738 "copy": true, 00:12:59.738 "nvme_iov_md": false 00:12:59.738 }, 00:12:59.738 "memory_domains": [ 00:12:59.738 { 00:12:59.738 "dma_device_id": "system", 00:12:59.738 "dma_device_type": 1 00:12:59.738 }, 00:12:59.738 { 00:12:59.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.738 "dma_device_type": 2 00:12:59.738 } 00:12:59.738 ], 00:12:59.738 
"driver_specific": {} 00:12:59.738 }' 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:59.738 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:59.997 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:59.997 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:59.997 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:59.997 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:59.997 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:59.997 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:59.997 "name": "BaseBdev2", 00:12:59.997 "aliases": [ 00:12:59.997 "c2ed0579-13d8-43cc-9b9a-be5d76bc7979" 00:12:59.997 ], 00:12:59.997 "product_name": 
"Malloc disk", 00:12:59.997 "block_size": 512, 00:12:59.997 "num_blocks": 65536, 00:12:59.997 "uuid": "c2ed0579-13d8-43cc-9b9a-be5d76bc7979", 00:12:59.997 "assigned_rate_limits": { 00:12:59.997 "rw_ios_per_sec": 0, 00:12:59.997 "rw_mbytes_per_sec": 0, 00:12:59.997 "r_mbytes_per_sec": 0, 00:12:59.997 "w_mbytes_per_sec": 0 00:12:59.997 }, 00:12:59.997 "claimed": true, 00:12:59.997 "claim_type": "exclusive_write", 00:12:59.997 "zoned": false, 00:12:59.997 "supported_io_types": { 00:12:59.997 "read": true, 00:12:59.997 "write": true, 00:12:59.997 "unmap": true, 00:12:59.997 "flush": true, 00:12:59.997 "reset": true, 00:12:59.997 "nvme_admin": false, 00:12:59.997 "nvme_io": false, 00:12:59.997 "nvme_io_md": false, 00:12:59.997 "write_zeroes": true, 00:12:59.997 "zcopy": true, 00:12:59.997 "get_zone_info": false, 00:12:59.997 "zone_management": false, 00:12:59.997 "zone_append": false, 00:12:59.997 "compare": false, 00:12:59.997 "compare_and_write": false, 00:12:59.997 "abort": true, 00:12:59.997 "seek_hole": false, 00:12:59.997 "seek_data": false, 00:12:59.997 "copy": true, 00:12:59.997 "nvme_iov_md": false 00:12:59.997 }, 00:12:59.997 "memory_domains": [ 00:12:59.997 { 00:12:59.997 "dma_device_id": "system", 00:12:59.997 "dma_device_type": 1 00:12:59.997 }, 00:12:59.997 { 00:12:59.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.997 "dma_device_type": 2 00:12:59.997 } 00:12:59.997 ], 00:12:59.997 "driver_specific": {} 00:12:59.997 }' 00:12:59.997 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.256 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.256 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:00.256 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:00.256 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:00.256 
11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:00.256 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:00.256 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:00.256 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:00.256 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:00.514 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:00.515 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:00.515 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:00.515 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:00.515 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:00.515 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:00.515 "name": "BaseBdev3", 00:13:00.515 "aliases": [ 00:13:00.515 "9b6515fa-0d07-4abd-a140-2b52d12231dc" 00:13:00.515 ], 00:13:00.515 "product_name": "Malloc disk", 00:13:00.515 "block_size": 512, 00:13:00.515 "num_blocks": 65536, 00:13:00.515 "uuid": "9b6515fa-0d07-4abd-a140-2b52d12231dc", 00:13:00.515 "assigned_rate_limits": { 00:13:00.515 "rw_ios_per_sec": 0, 00:13:00.515 "rw_mbytes_per_sec": 0, 00:13:00.515 "r_mbytes_per_sec": 0, 00:13:00.515 "w_mbytes_per_sec": 0 00:13:00.515 }, 00:13:00.515 "claimed": true, 00:13:00.515 "claim_type": "exclusive_write", 00:13:00.515 "zoned": false, 00:13:00.515 "supported_io_types": { 00:13:00.515 "read": true, 00:13:00.515 "write": true, 00:13:00.515 "unmap": true, 
00:13:00.515 "flush": true, 00:13:00.515 "reset": true, 00:13:00.515 "nvme_admin": false, 00:13:00.515 "nvme_io": false, 00:13:00.515 "nvme_io_md": false, 00:13:00.515 "write_zeroes": true, 00:13:00.515 "zcopy": true, 00:13:00.515 "get_zone_info": false, 00:13:00.515 "zone_management": false, 00:13:00.515 "zone_append": false, 00:13:00.515 "compare": false, 00:13:00.515 "compare_and_write": false, 00:13:00.515 "abort": true, 00:13:00.515 "seek_hole": false, 00:13:00.515 "seek_data": false, 00:13:00.515 "copy": true, 00:13:00.515 "nvme_iov_md": false 00:13:00.515 }, 00:13:00.515 "memory_domains": [ 00:13:00.515 { 00:13:00.515 "dma_device_id": "system", 00:13:00.515 "dma_device_type": 1 00:13:00.515 }, 00:13:00.515 { 00:13:00.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.515 "dma_device_type": 2 00:13:00.515 } 00:13:00.515 ], 00:13:00.515 "driver_specific": {} 00:13:00.515 }' 00:13:00.515 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.773 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.773 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:00.773 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:00.773 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:00.773 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:00.773 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:00.773 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:00.773 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:00.773 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:00.773 11:53:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:01.032 [2024-07-12 11:53:51.178767] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:01.032 [2024-07-12 11:53:51.178789] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:01.032 [2024-07-12 11:53:51.178828] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:01.032 [2024-07-12 11:53:51.178864] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:01.032 [2024-07-12 11:53:51.178870] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x237fd70 name Existed_Raid, state offline 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 627167 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 627167 ']' 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 627167 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 627167 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 627167' 00:13:01.032 killing process with pid 627167 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 627167 00:13:01.032 [2024-07-12 11:53:51.241852] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:01.032 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 627167 00:13:01.032 [2024-07-12 11:53:51.264988] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:01.292 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:01.292 00:13:01.292 real 0m21.293s 00:13:01.292 user 0m39.603s 00:13:01.292 sys 0m3.377s 00:13:01.292 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:01.292 11:53:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:01.292 ************************************ 00:13:01.292 END TEST raid_state_function_test_sb 00:13:01.292 ************************************ 00:13:01.292 11:53:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:01.292 11:53:51 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:13:01.292 11:53:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:01.292 11:53:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:01.292 11:53:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:01.292 ************************************ 00:13:01.292 START TEST raid_superblock_test 00:13:01.292 ************************************ 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local 
raid_level=concat 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=631215 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 631215 /var/tmp/spdk-raid.sock 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 631215 ']' 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:01.292 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:01.292 11:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.552 [2024-07-12 11:53:51.571603] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:13:01.552 [2024-07-12 11:53:51.571649] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631215 ] 00:13:01.552 [2024-07-12 11:53:51.635914] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:01.552 [2024-07-12 11:53:51.706318] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:01.552 [2024-07-12 11:53:51.764727] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:01.552 [2024-07-12 11:53:51.764756] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:02.120 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:13:02.379 malloc1 00:13:02.379 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:02.637 [2024-07-12 11:53:52.681558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:02.637 [2024-07-12 11:53:52.681595] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:02.637 [2024-07-12 11:53:52.681605] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x276c270 00:13:02.637 [2024-07-12 11:53:52.681611] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:02.637 [2024-07-12 11:53:52.682635] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:02.637 [2024-07-12 11:53:52.682655] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:02.637 pt1 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:02.637 malloc2 00:13:02.637 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:02.904 [2024-07-12 11:53:53.013872] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:02.904 [2024-07-12 11:53:53.013900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:02.904 [2024-07-12 11:53:53.013908] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x276d580 00:13:02.904 [2024-07-12 11:53:53.013914] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:02.904 [2024-07-12 11:53:53.014852] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:02.904 [2024-07-12 11:53:53.014871] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:02.904 pt2 00:13:02.904 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:02.904 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:02.904 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:02.904 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:02.904 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:02.904 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:02.904 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:02.904 11:53:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:02.904 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:03.167 malloc3 00:13:03.167 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:03.167 [2024-07-12 11:53:53.370140] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:03.167 [2024-07-12 11:53:53.370168] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:03.167 [2024-07-12 11:53:53.370177] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2917e30 00:13:03.167 [2024-07-12 11:53:53.370183] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:03.167 [2024-07-12 11:53:53.371162] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:03.167 [2024-07-12 11:53:53.371182] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:03.167 pt3 00:13:03.167 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:03.167 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:03.167 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:03.425 [2024-07-12 11:53:53.530571] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:03.425 [2024-07-12 11:53:53.531341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:03.425 [2024-07-12 
11:53:53.531375] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:03.425 [2024-07-12 11:53:53.531473] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x291b390 00:13:03.425 [2024-07-12 11:53:53.531480] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:03.425 [2024-07-12 11:53:53.531600] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x291dc00 00:13:03.425 [2024-07-12 11:53:53.531689] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x291b390 00:13:03.425 [2024-07-12 11:53:53.531694] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x291b390 00:13:03.425 [2024-07-12 11:53:53.531753] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.425 11:53:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.425 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:03.684 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.684 "name": "raid_bdev1", 00:13:03.684 "uuid": "27fb48db-caf8-49dc-b947-0687bbda9a44", 00:13:03.684 "strip_size_kb": 64, 00:13:03.684 "state": "online", 00:13:03.684 "raid_level": "concat", 00:13:03.684 "superblock": true, 00:13:03.684 "num_base_bdevs": 3, 00:13:03.684 "num_base_bdevs_discovered": 3, 00:13:03.684 "num_base_bdevs_operational": 3, 00:13:03.684 "base_bdevs_list": [ 00:13:03.684 { 00:13:03.684 "name": "pt1", 00:13:03.684 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:03.684 "is_configured": true, 00:13:03.684 "data_offset": 2048, 00:13:03.684 "data_size": 63488 00:13:03.684 }, 00:13:03.684 { 00:13:03.684 "name": "pt2", 00:13:03.684 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:03.684 "is_configured": true, 00:13:03.684 "data_offset": 2048, 00:13:03.684 "data_size": 63488 00:13:03.684 }, 00:13:03.684 { 00:13:03.684 "name": "pt3", 00:13:03.684 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:03.684 "is_configured": true, 00:13:03.684 "data_offset": 2048, 00:13:03.684 "data_size": 63488 00:13:03.684 } 00:13:03.684 ] 00:13:03.684 }' 00:13:03.684 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.684 11:53:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:03.942 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:03.942 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:03.942 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:13:03.942 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:03.942 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:03.942 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:03.942 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:03.942 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:04.201 [2024-07-12 11:53:54.332802] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:04.201 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:04.201 "name": "raid_bdev1", 00:13:04.201 "aliases": [ 00:13:04.201 "27fb48db-caf8-49dc-b947-0687bbda9a44" 00:13:04.201 ], 00:13:04.201 "product_name": "Raid Volume", 00:13:04.201 "block_size": 512, 00:13:04.201 "num_blocks": 190464, 00:13:04.201 "uuid": "27fb48db-caf8-49dc-b947-0687bbda9a44", 00:13:04.201 "assigned_rate_limits": { 00:13:04.201 "rw_ios_per_sec": 0, 00:13:04.201 "rw_mbytes_per_sec": 0, 00:13:04.201 "r_mbytes_per_sec": 0, 00:13:04.201 "w_mbytes_per_sec": 0 00:13:04.201 }, 00:13:04.201 "claimed": false, 00:13:04.201 "zoned": false, 00:13:04.201 "supported_io_types": { 00:13:04.201 "read": true, 00:13:04.201 "write": true, 00:13:04.201 "unmap": true, 00:13:04.201 "flush": true, 00:13:04.201 "reset": true, 00:13:04.201 "nvme_admin": false, 00:13:04.201 "nvme_io": false, 00:13:04.201 "nvme_io_md": false, 00:13:04.201 "write_zeroes": true, 00:13:04.201 "zcopy": false, 00:13:04.201 "get_zone_info": false, 00:13:04.201 "zone_management": false, 00:13:04.201 "zone_append": false, 00:13:04.201 "compare": false, 00:13:04.201 "compare_and_write": false, 00:13:04.201 "abort": false, 00:13:04.201 "seek_hole": false, 00:13:04.201 
"seek_data": false, 00:13:04.201 "copy": false, 00:13:04.201 "nvme_iov_md": false 00:13:04.201 }, 00:13:04.201 "memory_domains": [ 00:13:04.201 { 00:13:04.201 "dma_device_id": "system", 00:13:04.201 "dma_device_type": 1 00:13:04.201 }, 00:13:04.201 { 00:13:04.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.201 "dma_device_type": 2 00:13:04.201 }, 00:13:04.201 { 00:13:04.201 "dma_device_id": "system", 00:13:04.201 "dma_device_type": 1 00:13:04.201 }, 00:13:04.201 { 00:13:04.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.201 "dma_device_type": 2 00:13:04.201 }, 00:13:04.201 { 00:13:04.201 "dma_device_id": "system", 00:13:04.201 "dma_device_type": 1 00:13:04.201 }, 00:13:04.201 { 00:13:04.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.201 "dma_device_type": 2 00:13:04.201 } 00:13:04.201 ], 00:13:04.201 "driver_specific": { 00:13:04.201 "raid": { 00:13:04.201 "uuid": "27fb48db-caf8-49dc-b947-0687bbda9a44", 00:13:04.201 "strip_size_kb": 64, 00:13:04.201 "state": "online", 00:13:04.201 "raid_level": "concat", 00:13:04.201 "superblock": true, 00:13:04.201 "num_base_bdevs": 3, 00:13:04.201 "num_base_bdevs_discovered": 3, 00:13:04.201 "num_base_bdevs_operational": 3, 00:13:04.201 "base_bdevs_list": [ 00:13:04.201 { 00:13:04.201 "name": "pt1", 00:13:04.201 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:04.201 "is_configured": true, 00:13:04.201 "data_offset": 2048, 00:13:04.201 "data_size": 63488 00:13:04.201 }, 00:13:04.201 { 00:13:04.201 "name": "pt2", 00:13:04.201 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:04.201 "is_configured": true, 00:13:04.201 "data_offset": 2048, 00:13:04.201 "data_size": 63488 00:13:04.201 }, 00:13:04.201 { 00:13:04.201 "name": "pt3", 00:13:04.201 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:04.201 "is_configured": true, 00:13:04.201 "data_offset": 2048, 00:13:04.201 "data_size": 63488 00:13:04.201 } 00:13:04.201 ] 00:13:04.201 } 00:13:04.201 } 00:13:04.201 }' 00:13:04.201 11:53:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:04.201 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:04.201 pt2 00:13:04.201 pt3' 00:13:04.201 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:04.201 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:04.201 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:04.459 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:04.460 "name": "pt1", 00:13:04.460 "aliases": [ 00:13:04.460 "00000000-0000-0000-0000-000000000001" 00:13:04.460 ], 00:13:04.460 "product_name": "passthru", 00:13:04.460 "block_size": 512, 00:13:04.460 "num_blocks": 65536, 00:13:04.460 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:04.460 "assigned_rate_limits": { 00:13:04.460 "rw_ios_per_sec": 0, 00:13:04.460 "rw_mbytes_per_sec": 0, 00:13:04.460 "r_mbytes_per_sec": 0, 00:13:04.460 "w_mbytes_per_sec": 0 00:13:04.460 }, 00:13:04.460 "claimed": true, 00:13:04.460 "claim_type": "exclusive_write", 00:13:04.460 "zoned": false, 00:13:04.460 "supported_io_types": { 00:13:04.460 "read": true, 00:13:04.460 "write": true, 00:13:04.460 "unmap": true, 00:13:04.460 "flush": true, 00:13:04.460 "reset": true, 00:13:04.460 "nvme_admin": false, 00:13:04.460 "nvme_io": false, 00:13:04.460 "nvme_io_md": false, 00:13:04.460 "write_zeroes": true, 00:13:04.460 "zcopy": true, 00:13:04.460 "get_zone_info": false, 00:13:04.460 "zone_management": false, 00:13:04.460 "zone_append": false, 00:13:04.460 "compare": false, 00:13:04.460 "compare_and_write": false, 00:13:04.460 "abort": true, 00:13:04.460 "seek_hole": false, 00:13:04.460 "seek_data": false, 
00:13:04.460 "copy": true, 00:13:04.460 "nvme_iov_md": false 00:13:04.460 }, 00:13:04.460 "memory_domains": [ 00:13:04.460 { 00:13:04.460 "dma_device_id": "system", 00:13:04.460 "dma_device_type": 1 00:13:04.460 }, 00:13:04.460 { 00:13:04.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.460 "dma_device_type": 2 00:13:04.460 } 00:13:04.460 ], 00:13:04.460 "driver_specific": { 00:13:04.460 "passthru": { 00:13:04.460 "name": "pt1", 00:13:04.460 "base_bdev_name": "malloc1" 00:13:04.460 } 00:13:04.460 } 00:13:04.460 }' 00:13:04.460 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.460 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.460 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.460 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.460 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.460 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.460 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.719 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.719 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.719 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.719 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.719 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.719 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:04.719 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:13:04.719 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:04.978 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:04.978 "name": "pt2", 00:13:04.978 "aliases": [ 00:13:04.978 "00000000-0000-0000-0000-000000000002" 00:13:04.978 ], 00:13:04.978 "product_name": "passthru", 00:13:04.978 "block_size": 512, 00:13:04.978 "num_blocks": 65536, 00:13:04.978 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:04.978 "assigned_rate_limits": { 00:13:04.978 "rw_ios_per_sec": 0, 00:13:04.978 "rw_mbytes_per_sec": 0, 00:13:04.978 "r_mbytes_per_sec": 0, 00:13:04.978 "w_mbytes_per_sec": 0 00:13:04.978 }, 00:13:04.978 "claimed": true, 00:13:04.978 "claim_type": "exclusive_write", 00:13:04.978 "zoned": false, 00:13:04.978 "supported_io_types": { 00:13:04.978 "read": true, 00:13:04.978 "write": true, 00:13:04.978 "unmap": true, 00:13:04.978 "flush": true, 00:13:04.978 "reset": true, 00:13:04.978 "nvme_admin": false, 00:13:04.978 "nvme_io": false, 00:13:04.978 "nvme_io_md": false, 00:13:04.978 "write_zeroes": true, 00:13:04.978 "zcopy": true, 00:13:04.978 "get_zone_info": false, 00:13:04.978 "zone_management": false, 00:13:04.978 "zone_append": false, 00:13:04.978 "compare": false, 00:13:04.978 "compare_and_write": false, 00:13:04.978 "abort": true, 00:13:04.978 "seek_hole": false, 00:13:04.978 "seek_data": false, 00:13:04.978 "copy": true, 00:13:04.978 "nvme_iov_md": false 00:13:04.978 }, 00:13:04.978 "memory_domains": [ 00:13:04.978 { 00:13:04.978 "dma_device_id": "system", 00:13:04.978 "dma_device_type": 1 00:13:04.978 }, 00:13:04.978 { 00:13:04.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.978 "dma_device_type": 2 00:13:04.978 } 00:13:04.978 ], 00:13:04.978 "driver_specific": { 00:13:04.978 "passthru": { 00:13:04.978 "name": "pt2", 00:13:04.978 "base_bdev_name": "malloc2" 00:13:04.978 } 00:13:04.978 } 00:13:04.978 }' 00:13:04.978 11:53:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.978 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.978 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.978 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.978 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.978 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.978 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.978 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.238 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:05.238 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.238 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.238 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:05.238 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:05.238 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:05.238 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:05.496 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:05.496 "name": "pt3", 00:13:05.496 "aliases": [ 00:13:05.496 "00000000-0000-0000-0000-000000000003" 00:13:05.496 ], 00:13:05.496 "product_name": "passthru", 00:13:05.496 "block_size": 512, 00:13:05.496 "num_blocks": 65536, 00:13:05.497 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:05.497 "assigned_rate_limits": { 
00:13:05.497 "rw_ios_per_sec": 0, 00:13:05.497 "rw_mbytes_per_sec": 0, 00:13:05.497 "r_mbytes_per_sec": 0, 00:13:05.497 "w_mbytes_per_sec": 0 00:13:05.497 }, 00:13:05.497 "claimed": true, 00:13:05.497 "claim_type": "exclusive_write", 00:13:05.497 "zoned": false, 00:13:05.497 "supported_io_types": { 00:13:05.497 "read": true, 00:13:05.497 "write": true, 00:13:05.497 "unmap": true, 00:13:05.497 "flush": true, 00:13:05.497 "reset": true, 00:13:05.497 "nvme_admin": false, 00:13:05.497 "nvme_io": false, 00:13:05.497 "nvme_io_md": false, 00:13:05.497 "write_zeroes": true, 00:13:05.497 "zcopy": true, 00:13:05.497 "get_zone_info": false, 00:13:05.497 "zone_management": false, 00:13:05.497 "zone_append": false, 00:13:05.497 "compare": false, 00:13:05.497 "compare_and_write": false, 00:13:05.497 "abort": true, 00:13:05.497 "seek_hole": false, 00:13:05.497 "seek_data": false, 00:13:05.497 "copy": true, 00:13:05.497 "nvme_iov_md": false 00:13:05.497 }, 00:13:05.497 "memory_domains": [ 00:13:05.497 { 00:13:05.497 "dma_device_id": "system", 00:13:05.497 "dma_device_type": 1 00:13:05.497 }, 00:13:05.497 { 00:13:05.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.497 "dma_device_type": 2 00:13:05.497 } 00:13:05.497 ], 00:13:05.497 "driver_specific": { 00:13:05.497 "passthru": { 00:13:05.497 "name": "pt3", 00:13:05.497 "base_bdev_name": "malloc3" 00:13:05.497 } 00:13:05.497 } 00:13:05.497 }' 00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:05.497 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.755 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.755 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:05.755 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:05.755 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:05.755 [2024-07-12 11:53:55.945065] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:05.755 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=27fb48db-caf8-49dc-b947-0687bbda9a44 00:13:05.755 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 27fb48db-caf8-49dc-b947-0687bbda9a44 ']' 00:13:05.755 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:06.014 [2024-07-12 11:53:56.113314] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:06.014 [2024-07-12 11:53:56.113327] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:06.014 [2024-07-12 11:53:56.113362] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:06.014 [2024-07-12 11:53:56.113397] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:13:06.014 [2024-07-12 11:53:56.113403] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x291b390 name raid_bdev1, state offline 00:13:06.014 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:06.014 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.273 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:06.273 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:06.273 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:06.273 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:06.273 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:06.273 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:06.531 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:06.531 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:06.791 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:07.050 [2024-07-12 11:53:57.139951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:07.050 [2024-07-12 11:53:57.140904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:07.050 [2024-07-12 11:53:57.140935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:07.050 [2024-07-12 11:53:57.140965] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:07.050 [2024-07-12 11:53:57.140989] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:07.050 [2024-07-12 11:53:57.141002] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:07.050 [2024-07-12 11:53:57.141027] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:07.051 [2024-07-12 11:53:57.141033] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2915470 name raid_bdev1, state configuring 00:13:07.051 request: 00:13:07.051 { 00:13:07.051 "name": "raid_bdev1", 00:13:07.051 "raid_level": "concat", 00:13:07.051 "base_bdevs": [ 00:13:07.051 "malloc1", 00:13:07.051 "malloc2", 00:13:07.051 "malloc3" 00:13:07.051 ], 00:13:07.051 "superblock": false, 00:13:07.051 "strip_size_kb": 64, 00:13:07.051 "method": "bdev_raid_create", 00:13:07.051 "req_id": 1 00:13:07.051 } 00:13:07.051 Got JSON-RPC error response 00:13:07.051 response: 00:13:07.051 { 00:13:07.051 "code": -17, 00:13:07.051 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:07.051 } 00:13:07.051 11:53:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:07.051 11:53:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:13:07.051 11:53:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:07.051 11:53:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:07.051 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.051 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:07.310 [2024-07-12 11:53:57.468787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:07.310 [2024-07-12 11:53:57.468813] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:07.310 [2024-07-12 11:53:57.468823] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2918060 00:13:07.310 [2024-07-12 11:53:57.468829] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:07.310 [2024-07-12 11:53:57.469966] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:07.310 [2024-07-12 11:53:57.469989] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:07.310 [2024-07-12 11:53:57.470034] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:07.310 [2024-07-12 11:53:57.470054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:07.310 pt1 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.310 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:07.569 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:07.569 "name": "raid_bdev1", 00:13:07.569 "uuid": "27fb48db-caf8-49dc-b947-0687bbda9a44", 00:13:07.569 "strip_size_kb": 64, 00:13:07.569 "state": "configuring", 00:13:07.569 "raid_level": "concat", 00:13:07.569 "superblock": true, 00:13:07.569 "num_base_bdevs": 3, 00:13:07.569 "num_base_bdevs_discovered": 1, 00:13:07.569 "num_base_bdevs_operational": 3, 00:13:07.569 "base_bdevs_list": [ 00:13:07.569 { 00:13:07.569 "name": "pt1", 00:13:07.569 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:07.569 
"is_configured": true, 00:13:07.569 "data_offset": 2048, 00:13:07.569 "data_size": 63488 00:13:07.569 }, 00:13:07.569 { 00:13:07.569 "name": null, 00:13:07.569 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:07.569 "is_configured": false, 00:13:07.569 "data_offset": 2048, 00:13:07.569 "data_size": 63488 00:13:07.569 }, 00:13:07.569 { 00:13:07.569 "name": null, 00:13:07.569 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:07.569 "is_configured": false, 00:13:07.569 "data_offset": 2048, 00:13:07.569 "data_size": 63488 00:13:07.569 } 00:13:07.569 ] 00:13:07.569 }' 00:13:07.569 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:07.569 11:53:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.165 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:08.165 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:08.165 [2024-07-12 11:53:58.286909] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:08.165 [2024-07-12 11:53:58.286945] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:08.165 [2024-07-12 11:53:58.286955] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x276cd90 00:13:08.165 [2024-07-12 11:53:58.286960] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:08.165 [2024-07-12 11:53:58.287208] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:08.165 [2024-07-12 11:53:58.287218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:08.165 [2024-07-12 11:53:58.287260] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:08.165 [2024-07-12 
11:53:58.287273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:08.165 pt2 00:13:08.165 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:08.424 [2024-07-12 11:53:58.455352] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.424 "name": "raid_bdev1", 00:13:08.424 
"uuid": "27fb48db-caf8-49dc-b947-0687bbda9a44", 00:13:08.424 "strip_size_kb": 64, 00:13:08.424 "state": "configuring", 00:13:08.424 "raid_level": "concat", 00:13:08.424 "superblock": true, 00:13:08.424 "num_base_bdevs": 3, 00:13:08.424 "num_base_bdevs_discovered": 1, 00:13:08.424 "num_base_bdevs_operational": 3, 00:13:08.424 "base_bdevs_list": [ 00:13:08.424 { 00:13:08.424 "name": "pt1", 00:13:08.424 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:08.424 "is_configured": true, 00:13:08.424 "data_offset": 2048, 00:13:08.424 "data_size": 63488 00:13:08.424 }, 00:13:08.424 { 00:13:08.424 "name": null, 00:13:08.424 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:08.424 "is_configured": false, 00:13:08.424 "data_offset": 2048, 00:13:08.424 "data_size": 63488 00:13:08.424 }, 00:13:08.424 { 00:13:08.424 "name": null, 00:13:08.424 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:08.424 "is_configured": false, 00:13:08.424 "data_offset": 2048, 00:13:08.424 "data_size": 63488 00:13:08.424 } 00:13:08.424 ] 00:13:08.424 }' 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.424 11:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.991 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:08.991 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:08.991 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:09.251 [2024-07-12 11:53:59.301532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:09.251 [2024-07-12 11:53:59.301570] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:09.251 [2024-07-12 11:53:59.301580] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2916820 00:13:09.251 [2024-07-12 11:53:59.301602] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.251 [2024-07-12 11:53:59.301861] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.251 [2024-07-12 11:53:59.301870] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:09.251 [2024-07-12 11:53:59.301916] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:09.251 [2024-07-12 11:53:59.301930] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:09.251 pt2 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:09.251 [2024-07-12 11:53:59.453923] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:09.251 [2024-07-12 11:53:59.453943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:09.251 [2024-07-12 11:53:59.453952] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x291a540 00:13:09.251 [2024-07-12 11:53:59.453957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.251 [2024-07-12 11:53:59.454156] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.251 [2024-07-12 11:53:59.454165] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:09.251 [2024-07-12 11:53:59.454199] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:09.251 
[2024-07-12 11:53:59.454210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:09.251 [2024-07-12 11:53:59.454286] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x291d7e0 00:13:09.251 [2024-07-12 11:53:59.454292] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:09.251 [2024-07-12 11:53:59.454400] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x291d370 00:13:09.251 [2024-07-12 11:53:59.454485] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x291d7e0 00:13:09.251 [2024-07-12 11:53:59.454490] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x291d7e0 00:13:09.251 [2024-07-12 11:53:59.454558] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:09.251 pt3 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.251 
11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.251 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:09.510 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.510 "name": "raid_bdev1", 00:13:09.510 "uuid": "27fb48db-caf8-49dc-b947-0687bbda9a44", 00:13:09.510 "strip_size_kb": 64, 00:13:09.510 "state": "online", 00:13:09.510 "raid_level": "concat", 00:13:09.510 "superblock": true, 00:13:09.510 "num_base_bdevs": 3, 00:13:09.510 "num_base_bdevs_discovered": 3, 00:13:09.510 "num_base_bdevs_operational": 3, 00:13:09.510 "base_bdevs_list": [ 00:13:09.510 { 00:13:09.510 "name": "pt1", 00:13:09.510 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:09.510 "is_configured": true, 00:13:09.510 "data_offset": 2048, 00:13:09.510 "data_size": 63488 00:13:09.510 }, 00:13:09.510 { 00:13:09.510 "name": "pt2", 00:13:09.510 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:09.510 "is_configured": true, 00:13:09.510 "data_offset": 2048, 00:13:09.510 "data_size": 63488 00:13:09.510 }, 00:13:09.510 { 00:13:09.510 "name": "pt3", 00:13:09.510 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:09.510 "is_configured": true, 00:13:09.510 "data_offset": 2048, 00:13:09.510 "data_size": 63488 00:13:09.510 } 00:13:09.510 ] 00:13:09.510 }' 00:13:09.510 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.510 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.078 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:13:10.078 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:10.078 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:10.078 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:10.078 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:10.078 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:10.079 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:10.079 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:10.079 [2024-07-12 11:54:00.292303] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:10.079 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:10.079 "name": "raid_bdev1", 00:13:10.079 "aliases": [ 00:13:10.079 "27fb48db-caf8-49dc-b947-0687bbda9a44" 00:13:10.079 ], 00:13:10.079 "product_name": "Raid Volume", 00:13:10.079 "block_size": 512, 00:13:10.079 "num_blocks": 190464, 00:13:10.079 "uuid": "27fb48db-caf8-49dc-b947-0687bbda9a44", 00:13:10.079 "assigned_rate_limits": { 00:13:10.079 "rw_ios_per_sec": 0, 00:13:10.079 "rw_mbytes_per_sec": 0, 00:13:10.079 "r_mbytes_per_sec": 0, 00:13:10.079 "w_mbytes_per_sec": 0 00:13:10.079 }, 00:13:10.079 "claimed": false, 00:13:10.079 "zoned": false, 00:13:10.079 "supported_io_types": { 00:13:10.079 "read": true, 00:13:10.079 "write": true, 00:13:10.079 "unmap": true, 00:13:10.079 "flush": true, 00:13:10.079 "reset": true, 00:13:10.079 "nvme_admin": false, 00:13:10.079 "nvme_io": false, 00:13:10.079 "nvme_io_md": false, 00:13:10.079 "write_zeroes": true, 00:13:10.079 "zcopy": false, 00:13:10.079 
"get_zone_info": false, 00:13:10.079 "zone_management": false, 00:13:10.079 "zone_append": false, 00:13:10.079 "compare": false, 00:13:10.079 "compare_and_write": false, 00:13:10.079 "abort": false, 00:13:10.079 "seek_hole": false, 00:13:10.079 "seek_data": false, 00:13:10.079 "copy": false, 00:13:10.079 "nvme_iov_md": false 00:13:10.079 }, 00:13:10.079 "memory_domains": [ 00:13:10.079 { 00:13:10.079 "dma_device_id": "system", 00:13:10.079 "dma_device_type": 1 00:13:10.079 }, 00:13:10.079 { 00:13:10.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.079 "dma_device_type": 2 00:13:10.079 }, 00:13:10.079 { 00:13:10.079 "dma_device_id": "system", 00:13:10.079 "dma_device_type": 1 00:13:10.079 }, 00:13:10.079 { 00:13:10.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.079 "dma_device_type": 2 00:13:10.079 }, 00:13:10.079 { 00:13:10.079 "dma_device_id": "system", 00:13:10.079 "dma_device_type": 1 00:13:10.079 }, 00:13:10.079 { 00:13:10.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.079 "dma_device_type": 2 00:13:10.079 } 00:13:10.079 ], 00:13:10.079 "driver_specific": { 00:13:10.079 "raid": { 00:13:10.079 "uuid": "27fb48db-caf8-49dc-b947-0687bbda9a44", 00:13:10.079 "strip_size_kb": 64, 00:13:10.079 "state": "online", 00:13:10.079 "raid_level": "concat", 00:13:10.079 "superblock": true, 00:13:10.079 "num_base_bdevs": 3, 00:13:10.079 "num_base_bdevs_discovered": 3, 00:13:10.079 "num_base_bdevs_operational": 3, 00:13:10.079 "base_bdevs_list": [ 00:13:10.079 { 00:13:10.079 "name": "pt1", 00:13:10.079 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:10.079 "is_configured": true, 00:13:10.079 "data_offset": 2048, 00:13:10.079 "data_size": 63488 00:13:10.079 }, 00:13:10.079 { 00:13:10.079 "name": "pt2", 00:13:10.079 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:10.079 "is_configured": true, 00:13:10.079 "data_offset": 2048, 00:13:10.079 "data_size": 63488 00:13:10.079 }, 00:13:10.079 { 00:13:10.079 "name": "pt3", 00:13:10.079 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:13:10.079 "is_configured": true, 00:13:10.079 "data_offset": 2048, 00:13:10.079 "data_size": 63488 00:13:10.079 } 00:13:10.079 ] 00:13:10.079 } 00:13:10.079 } 00:13:10.079 }' 00:13:10.079 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:10.338 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:10.338 pt2 00:13:10.338 pt3' 00:13:10.338 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:10.338 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:10.338 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:10.338 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:10.338 "name": "pt1", 00:13:10.338 "aliases": [ 00:13:10.338 "00000000-0000-0000-0000-000000000001" 00:13:10.338 ], 00:13:10.338 "product_name": "passthru", 00:13:10.338 "block_size": 512, 00:13:10.338 "num_blocks": 65536, 00:13:10.338 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:10.338 "assigned_rate_limits": { 00:13:10.338 "rw_ios_per_sec": 0, 00:13:10.338 "rw_mbytes_per_sec": 0, 00:13:10.338 "r_mbytes_per_sec": 0, 00:13:10.338 "w_mbytes_per_sec": 0 00:13:10.338 }, 00:13:10.338 "claimed": true, 00:13:10.338 "claim_type": "exclusive_write", 00:13:10.338 "zoned": false, 00:13:10.338 "supported_io_types": { 00:13:10.338 "read": true, 00:13:10.338 "write": true, 00:13:10.338 "unmap": true, 00:13:10.338 "flush": true, 00:13:10.338 "reset": true, 00:13:10.338 "nvme_admin": false, 00:13:10.338 "nvme_io": false, 00:13:10.338 "nvme_io_md": false, 00:13:10.338 "write_zeroes": true, 00:13:10.338 "zcopy": true, 00:13:10.338 "get_zone_info": false, 
00:13:10.338 "zone_management": false, 00:13:10.338 "zone_append": false, 00:13:10.338 "compare": false, 00:13:10.338 "compare_and_write": false, 00:13:10.338 "abort": true, 00:13:10.338 "seek_hole": false, 00:13:10.338 "seek_data": false, 00:13:10.338 "copy": true, 00:13:10.338 "nvme_iov_md": false 00:13:10.338 }, 00:13:10.338 "memory_domains": [ 00:13:10.338 { 00:13:10.338 "dma_device_id": "system", 00:13:10.338 "dma_device_type": 1 00:13:10.338 }, 00:13:10.338 { 00:13:10.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.338 "dma_device_type": 2 00:13:10.338 } 00:13:10.338 ], 00:13:10.338 "driver_specific": { 00:13:10.338 "passthru": { 00:13:10.338 "name": "pt1", 00:13:10.338 "base_bdev_name": "malloc1" 00:13:10.338 } 00:13:10.338 } 00:13:10.338 }' 00:13:10.338 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:10.338 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:10.598 11:54:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:10.598 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:10.857 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:10.857 "name": "pt2", 00:13:10.857 "aliases": [ 00:13:10.857 "00000000-0000-0000-0000-000000000002" 00:13:10.857 ], 00:13:10.857 "product_name": "passthru", 00:13:10.857 "block_size": 512, 00:13:10.857 "num_blocks": 65536, 00:13:10.857 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:10.857 "assigned_rate_limits": { 00:13:10.857 "rw_ios_per_sec": 0, 00:13:10.857 "rw_mbytes_per_sec": 0, 00:13:10.857 "r_mbytes_per_sec": 0, 00:13:10.857 "w_mbytes_per_sec": 0 00:13:10.857 }, 00:13:10.857 "claimed": true, 00:13:10.857 "claim_type": "exclusive_write", 00:13:10.857 "zoned": false, 00:13:10.857 "supported_io_types": { 00:13:10.857 "read": true, 00:13:10.857 "write": true, 00:13:10.857 "unmap": true, 00:13:10.857 "flush": true, 00:13:10.857 "reset": true, 00:13:10.857 "nvme_admin": false, 00:13:10.857 "nvme_io": false, 00:13:10.857 "nvme_io_md": false, 00:13:10.857 "write_zeroes": true, 00:13:10.857 "zcopy": true, 00:13:10.857 "get_zone_info": false, 00:13:10.857 "zone_management": false, 00:13:10.857 "zone_append": false, 00:13:10.857 "compare": false, 00:13:10.857 "compare_and_write": false, 00:13:10.857 "abort": true, 00:13:10.857 "seek_hole": false, 00:13:10.857 "seek_data": false, 00:13:10.857 "copy": true, 00:13:10.857 "nvme_iov_md": false 00:13:10.857 }, 00:13:10.857 "memory_domains": [ 00:13:10.857 { 00:13:10.857 "dma_device_id": "system", 00:13:10.857 "dma_device_type": 1 00:13:10.857 }, 00:13:10.857 { 00:13:10.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.857 
"dma_device_type": 2 00:13:10.857 } 00:13:10.857 ], 00:13:10.857 "driver_specific": { 00:13:10.857 "passthru": { 00:13:10.857 "name": "pt2", 00:13:10.857 "base_bdev_name": "malloc2" 00:13:10.857 } 00:13:10.857 } 00:13:10.857 }' 00:13:10.857 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:10.857 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:10.857 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:10.857 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:10.857 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:11.116 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:11.374 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:11.374 "name": "pt3", 00:13:11.374 "aliases": [ 00:13:11.374 
"00000000-0000-0000-0000-000000000003" 00:13:11.374 ], 00:13:11.374 "product_name": "passthru", 00:13:11.374 "block_size": 512, 00:13:11.374 "num_blocks": 65536, 00:13:11.374 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:11.374 "assigned_rate_limits": { 00:13:11.374 "rw_ios_per_sec": 0, 00:13:11.374 "rw_mbytes_per_sec": 0, 00:13:11.374 "r_mbytes_per_sec": 0, 00:13:11.374 "w_mbytes_per_sec": 0 00:13:11.374 }, 00:13:11.374 "claimed": true, 00:13:11.374 "claim_type": "exclusive_write", 00:13:11.374 "zoned": false, 00:13:11.374 "supported_io_types": { 00:13:11.374 "read": true, 00:13:11.374 "write": true, 00:13:11.374 "unmap": true, 00:13:11.374 "flush": true, 00:13:11.374 "reset": true, 00:13:11.374 "nvme_admin": false, 00:13:11.374 "nvme_io": false, 00:13:11.374 "nvme_io_md": false, 00:13:11.374 "write_zeroes": true, 00:13:11.374 "zcopy": true, 00:13:11.374 "get_zone_info": false, 00:13:11.374 "zone_management": false, 00:13:11.374 "zone_append": false, 00:13:11.374 "compare": false, 00:13:11.374 "compare_and_write": false, 00:13:11.374 "abort": true, 00:13:11.374 "seek_hole": false, 00:13:11.375 "seek_data": false, 00:13:11.375 "copy": true, 00:13:11.375 "nvme_iov_md": false 00:13:11.375 }, 00:13:11.375 "memory_domains": [ 00:13:11.375 { 00:13:11.375 "dma_device_id": "system", 00:13:11.375 "dma_device_type": 1 00:13:11.375 }, 00:13:11.375 { 00:13:11.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.375 "dma_device_type": 2 00:13:11.375 } 00:13:11.375 ], 00:13:11.375 "driver_specific": { 00:13:11.375 "passthru": { 00:13:11.375 "name": "pt3", 00:13:11.375 "base_bdev_name": "malloc3" 00:13:11.375 } 00:13:11.375 } 00:13:11.375 }' 00:13:11.375 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:11.375 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:11.375 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:11.375 11:54:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:11.375 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:11.633 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:11.633 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:11.633 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:11.633 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:11.633 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:11.633 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:11.633 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:11.633 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:11.633 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:11.892 [2024-07-12 11:54:01.940580] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 27fb48db-caf8-49dc-b947-0687bbda9a44 '!=' 27fb48db-caf8-49dc-b947-0687bbda9a44 ']' 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 631215 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 631215 ']' 00:13:11.892 11:54:01 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 631215 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 631215 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 631215' 00:13:11.892 killing process with pid 631215 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 631215 00:13:11.892 [2024-07-12 11:54:01.988284] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:11.892 [2024-07-12 11:54:01.988325] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:11.892 [2024-07-12 11:54:01.988363] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:11.892 [2024-07-12 11:54:01.988370] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x291d7e0 name raid_bdev1, state offline 00:13:11.892 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 631215 00:13:11.892 [2024-07-12 11:54:02.011131] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:12.151 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:12.151 00:13:12.151 real 0m10.669s 00:13:12.151 user 0m19.446s 00:13:12.151 sys 0m1.644s 00:13:12.152 11:54:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:12.152 11:54:02 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@10 -- # set +x 00:13:12.152 ************************************ 00:13:12.152 END TEST raid_superblock_test 00:13:12.152 ************************************ 00:13:12.152 11:54:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:12.152 11:54:02 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:13:12.152 11:54:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:12.152 11:54:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:12.152 11:54:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:12.152 ************************************ 00:13:12.152 START TEST raid_read_error_test 00:13:12.152 ************************************ 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ZR1YwPleSg 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=633451 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 633451 /var/tmp/spdk-raid.sock 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 633451 ']' 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:12.152 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:12.152 11:54:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:12.152 [2024-07-12 11:54:02.300706] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:13:12.152 [2024-07-12 11:54:02.300741] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid633451 ] 00:13:12.152 [2024-07-12 11:54:02.364739] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:12.411 [2024-07-12 11:54:02.443218] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:12.411 [2024-07-12 11:54:02.497375] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:12.411 [2024-07-12 11:54:02.497402] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:12.978 11:54:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:12.978 11:54:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:12.978 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:12.978 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:13.235 BaseBdev1_malloc 00:13:13.235 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:13.235 true 00:13:13.235 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:13.493 [2024-07-12 11:54:03.601663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:13.493 [2024-07-12 11:54:03.601694] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:13.493 
[2024-07-12 11:54:03.601706] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11782d0 00:13:13.493 [2024-07-12 11:54:03.601713] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:13.493 [2024-07-12 11:54:03.602962] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:13.493 [2024-07-12 11:54:03.602982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:13.493 BaseBdev1 00:13:13.493 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:13.493 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:13.752 BaseBdev2_malloc 00:13:13.752 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:13.752 true 00:13:13.752 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:14.011 [2024-07-12 11:54:04.098368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:14.012 [2024-07-12 11:54:04.098398] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:14.012 [2024-07-12 11:54:04.098411] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x117cf40 00:13:14.012 [2024-07-12 11:54:04.098417] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:14.012 [2024-07-12 11:54:04.099487] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:14.012 [2024-07-12 11:54:04.099508] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:14.012 BaseBdev2 00:13:14.012 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:14.012 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:14.271 BaseBdev3_malloc 00:13:14.271 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:14.271 true 00:13:14.271 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:14.530 [2024-07-12 11:54:04.611439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:14.530 [2024-07-12 11:54:04.611467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:14.530 [2024-07-12 11:54:04.611479] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x117fea0 00:13:14.530 [2024-07-12 11:54:04.611485] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:14.530 [2024-07-12 11:54:04.612564] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:14.530 [2024-07-12 11:54:04.612583] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:14.530 BaseBdev3 00:13:14.530 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:14.530 [2024-07-12 11:54:04.775894] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:14.789 [2024-07-12 11:54:04.776843] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:14.789 [2024-07-12 11:54:04.776890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:14.789 [2024-07-12 11:54:04.777031] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1179000 00:13:14.789 [2024-07-12 11:54:04.777039] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:14.789 [2024-07-12 11:54:04.777176] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x117f7c0 00:13:14.789 [2024-07-12 11:54:04.777278] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1179000 00:13:14.789 [2024-07-12 11:54:04.777283] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1179000 00:13:14.789 [2024-07-12 11:54:04.777351] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:14.789 "name": "raid_bdev1", 00:13:14.789 "uuid": "3534d58d-bc51-49c5-8879-add5d3fb778a", 00:13:14.789 "strip_size_kb": 64, 00:13:14.789 "state": "online", 00:13:14.789 "raid_level": "concat", 00:13:14.789 "superblock": true, 00:13:14.789 "num_base_bdevs": 3, 00:13:14.789 "num_base_bdevs_discovered": 3, 00:13:14.789 "num_base_bdevs_operational": 3, 00:13:14.789 "base_bdevs_list": [ 00:13:14.789 { 00:13:14.789 "name": "BaseBdev1", 00:13:14.789 "uuid": "4ca0041e-2755-54c7-85bf-5f7219bfb977", 00:13:14.789 "is_configured": true, 00:13:14.789 "data_offset": 2048, 00:13:14.789 "data_size": 63488 00:13:14.789 }, 00:13:14.789 { 00:13:14.789 "name": "BaseBdev2", 00:13:14.789 "uuid": "9baf8281-a401-53b6-a378-8d96298f7975", 00:13:14.789 "is_configured": true, 00:13:14.789 "data_offset": 2048, 00:13:14.789 "data_size": 63488 00:13:14.789 }, 00:13:14.789 { 00:13:14.789 "name": "BaseBdev3", 00:13:14.789 "uuid": "a31a82b0-4b02-530d-a254-2e9c9f589597", 00:13:14.789 "is_configured": true, 00:13:14.789 "data_offset": 2048, 00:13:14.789 "data_size": 63488 00:13:14.789 } 00:13:14.789 ] 00:13:14.789 }' 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:14.789 11:54:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:15.355 11:54:05 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:13:15.355 11:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:15.355 [2024-07-12 11:54:05.538052] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfce880 00:13:16.291 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.550 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:16.809 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.809 "name": "raid_bdev1", 00:13:16.809 "uuid": "3534d58d-bc51-49c5-8879-add5d3fb778a", 00:13:16.809 "strip_size_kb": 64, 00:13:16.809 "state": "online", 00:13:16.809 "raid_level": "concat", 00:13:16.809 "superblock": true, 00:13:16.809 "num_base_bdevs": 3, 00:13:16.809 "num_base_bdevs_discovered": 3, 00:13:16.809 "num_base_bdevs_operational": 3, 00:13:16.809 "base_bdevs_list": [ 00:13:16.809 { 00:13:16.809 "name": "BaseBdev1", 00:13:16.809 "uuid": "4ca0041e-2755-54c7-85bf-5f7219bfb977", 00:13:16.809 "is_configured": true, 00:13:16.809 "data_offset": 2048, 00:13:16.809 "data_size": 63488 00:13:16.809 }, 00:13:16.809 { 00:13:16.809 "name": "BaseBdev2", 00:13:16.809 "uuid": "9baf8281-a401-53b6-a378-8d96298f7975", 00:13:16.809 "is_configured": true, 00:13:16.809 "data_offset": 2048, 00:13:16.809 "data_size": 63488 00:13:16.809 }, 00:13:16.809 { 00:13:16.809 "name": "BaseBdev3", 00:13:16.809 "uuid": "a31a82b0-4b02-530d-a254-2e9c9f589597", 00:13:16.809 "is_configured": true, 00:13:16.809 "data_offset": 2048, 00:13:16.809 "data_size": 63488 00:13:16.809 } 00:13:16.809 ] 00:13:16.809 }' 00:13:16.809 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.809 11:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:17.377 [2024-07-12 
11:54:07.470650] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:17.377 [2024-07-12 11:54:07.470684] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:17.377 [2024-07-12 11:54:07.472823] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:17.377 [2024-07-12 11:54:07.472849] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:17.377 [2024-07-12 11:54:07.472870] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:17.377 [2024-07-12 11:54:07.472875] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1179000 name raid_bdev1, state offline 00:13:17.377 0 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 633451 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 633451 ']' 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 633451 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 633451 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 633451' 00:13:17.377 killing process with pid 633451 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 633451 00:13:17.377 [2024-07-12 11:54:07.533468] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:17.377 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 633451 00:13:17.377 [2024-07-12 11:54:07.551488] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ZR1YwPleSg 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:13:17.636 00:13:17.636 real 0m5.504s 00:13:17.636 user 0m8.565s 00:13:17.636 sys 0m0.802s 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:17.636 11:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.636 ************************************ 00:13:17.636 END TEST raid_read_error_test 00:13:17.636 ************************************ 00:13:17.636 11:54:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:17.636 11:54:07 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:13:17.636 11:54:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:17.636 11:54:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:17.636 11:54:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:17.636 
************************************ 00:13:17.636 START TEST raid_write_error_test 00:13:17.636 ************************************ 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:17.636 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.jIcAZU6LXO 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=634458 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 634458 /var/tmp/spdk-raid.sock 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 634458 ']' 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:13:17.637 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:17.637 11:54:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.637 [2024-07-12 11:54:07.877454] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:13:17.637 [2024-07-12 11:54:07.877497] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid634458 ] 00:13:17.895 [2024-07-12 11:54:07.943851] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:17.895 [2024-07-12 11:54:08.016197] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:17.895 [2024-07-12 11:54:08.067965] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:17.895 [2024-07-12 11:54:08.067988] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.463 11:54:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:18.463 11:54:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:18.463 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:18.463 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:18.721 BaseBdev1_malloc 00:13:18.721 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:18.981 true 00:13:18.981 11:54:09 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:18.981 [2024-07-12 11:54:09.171866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:18.981 [2024-07-12 11:54:09.171900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:18.981 [2024-07-12 11:54:09.171909] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa822d0 00:13:18.981 [2024-07-12 11:54:09.171915] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:18.981 [2024-07-12 11:54:09.172990] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:18.981 [2024-07-12 11:54:09.173009] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:18.981 BaseBdev1 00:13:18.981 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:18.981 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:19.240 BaseBdev2_malloc 00:13:19.240 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:19.498 true 00:13:19.498 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:19.498 [2024-07-12 11:54:09.704581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:19.498 [2024-07-12 11:54:09.704610] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:19.498 [2024-07-12 11:54:09.704622] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa86f40 00:13:19.498 [2024-07-12 11:54:09.704629] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:19.498 [2024-07-12 11:54:09.705733] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:19.498 [2024-07-12 11:54:09.705754] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:19.498 BaseBdev2 00:13:19.498 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:19.498 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:19.756 BaseBdev3_malloc 00:13:19.756 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:20.015 true 00:13:20.015 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:20.015 [2024-07-12 11:54:10.225575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:20.015 [2024-07-12 11:54:10.225607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:20.015 [2024-07-12 11:54:10.225617] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa89ea0 00:13:20.015 [2024-07-12 11:54:10.225623] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:20.015 [2024-07-12 11:54:10.226585] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:13:20.015 [2024-07-12 11:54:10.226605] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:20.015 BaseBdev3 00:13:20.015 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:20.273 [2024-07-12 11:54:10.382003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:20.273 [2024-07-12 11:54:10.382771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:20.273 [2024-07-12 11:54:10.382814] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:20.273 [2024-07-12 11:54:10.382940] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa83000 00:13:20.273 [2024-07-12 11:54:10.382946] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:20.273 [2024-07-12 11:54:10.383062] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa897c0 00:13:20.273 [2024-07-12 11:54:10.383151] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa83000 00:13:20.273 [2024-07-12 11:54:10.383156] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa83000 00:13:20.273 [2024-07-12 11:54:10.383215] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.273 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:20.531 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.531 "name": "raid_bdev1", 00:13:20.531 "uuid": "0ee4dc56-186e-471e-96c6-e8c3c4ab1a8c", 00:13:20.531 "strip_size_kb": 64, 00:13:20.531 "state": "online", 00:13:20.531 "raid_level": "concat", 00:13:20.531 "superblock": true, 00:13:20.531 "num_base_bdevs": 3, 00:13:20.531 "num_base_bdevs_discovered": 3, 00:13:20.531 "num_base_bdevs_operational": 3, 00:13:20.531 "base_bdevs_list": [ 00:13:20.531 { 00:13:20.531 "name": "BaseBdev1", 00:13:20.531 "uuid": "fbba94a1-c2bf-5adb-a43a-fad37e57186b", 00:13:20.531 "is_configured": true, 00:13:20.531 "data_offset": 2048, 00:13:20.531 "data_size": 63488 00:13:20.531 }, 00:13:20.531 { 00:13:20.531 "name": "BaseBdev2", 00:13:20.531 "uuid": "fe94cf28-957c-5ce9-af3c-f05427030b70", 00:13:20.531 "is_configured": true, 00:13:20.531 "data_offset": 2048, 00:13:20.531 "data_size": 63488 00:13:20.531 }, 00:13:20.531 { 00:13:20.531 "name": "BaseBdev3", 00:13:20.531 
"uuid": "8777d7b2-279e-5625-a5c2-48bcfe35305d", 00:13:20.531 "is_configured": true, 00:13:20.531 "data_offset": 2048, 00:13:20.531 "data_size": 63488 00:13:20.531 } 00:13:20.531 ] 00:13:20.531 }' 00:13:20.531 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.531 11:54:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.099 11:54:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:21.099 11:54:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:21.099 [2024-07-12 11:54:11.124130] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8d8880 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:22.033 11:54:12 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.033 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:22.292 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.292 "name": "raid_bdev1", 00:13:22.292 "uuid": "0ee4dc56-186e-471e-96c6-e8c3c4ab1a8c", 00:13:22.292 "strip_size_kb": 64, 00:13:22.292 "state": "online", 00:13:22.292 "raid_level": "concat", 00:13:22.292 "superblock": true, 00:13:22.292 "num_base_bdevs": 3, 00:13:22.292 "num_base_bdevs_discovered": 3, 00:13:22.292 "num_base_bdevs_operational": 3, 00:13:22.292 "base_bdevs_list": [ 00:13:22.292 { 00:13:22.292 "name": "BaseBdev1", 00:13:22.292 "uuid": "fbba94a1-c2bf-5adb-a43a-fad37e57186b", 00:13:22.292 "is_configured": true, 00:13:22.292 "data_offset": 2048, 00:13:22.292 "data_size": 63488 00:13:22.292 }, 00:13:22.292 { 00:13:22.292 "name": "BaseBdev2", 00:13:22.292 "uuid": "fe94cf28-957c-5ce9-af3c-f05427030b70", 00:13:22.292 "is_configured": true, 00:13:22.292 "data_offset": 2048, 00:13:22.292 "data_size": 63488 00:13:22.292 }, 00:13:22.292 { 00:13:22.292 "name": "BaseBdev3", 00:13:22.292 "uuid": "8777d7b2-279e-5625-a5c2-48bcfe35305d", 00:13:22.292 "is_configured": true, 00:13:22.292 "data_offset": 2048, 00:13:22.292 "data_size": 
63488 00:13:22.292 } 00:13:22.292 ] 00:13:22.292 }' 00:13:22.292 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.292 11:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.857 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:22.857 [2024-07-12 11:54:13.028341] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:22.857 [2024-07-12 11:54:13.028369] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:22.857 [2024-07-12 11:54:13.030408] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:22.857 [2024-07-12 11:54:13.030433] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:22.857 [2024-07-12 11:54:13.030452] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:22.857 [2024-07-12 11:54:13.030458] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa83000 name raid_bdev1, state offline 00:13:22.857 0 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 634458 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 634458 ']' 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 634458 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 634458 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 634458' 00:13:22.857 killing process with pid 634458 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 634458 00:13:22.857 [2024-07-12 11:54:13.085489] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:22.857 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 634458 00:13:23.115 [2024-07-12 11:54:13.103665] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.jIcAZU6LXO 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:13:23.115 00:13:23.115 real 0m5.480s 00:13:23.115 user 0m8.496s 00:13:23.115 sys 0m0.828s 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:23.115 11:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.115 ************************************ 00:13:23.115 END TEST raid_write_error_test 00:13:23.115 ************************************ 
00:13:23.115 11:54:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:23.115 11:54:13 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:23.115 11:54:13 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:13:23.115 11:54:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:23.115 11:54:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:23.115 11:54:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:23.115 ************************************ 00:13:23.115 START TEST raid_state_function_test 00:13:23.115 ************************************ 00:13:23.115 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=635857 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 635857' 00:13:23.373 Process raid pid: 635857 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock 
-i 0 -L bdev_raid 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 635857 /var/tmp/spdk-raid.sock 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 635857 ']' 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:23.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:23.373 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.373 [2024-07-12 11:54:13.417589] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:13:23.373 [2024-07-12 11:54:13.417624] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:23.373 [2024-07-12 11:54:13.481839] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:23.373 [2024-07-12 11:54:13.560009] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:23.373 [2024-07-12 11:54:13.610580] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:23.373 [2024-07-12 11:54:13.610602] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:24.308 [2024-07-12 11:54:14.361651] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:24.308 [2024-07-12 11:54:14.361679] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:24.308 [2024-07-12 11:54:14.361685] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:24.308 [2024-07-12 11:54:14.361694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:24.308 [2024-07-12 11:54:14.361698] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:24.308 [2024-07-12 11:54:14.361703] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:24.308 11:54:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.308 "name": "Existed_Raid", 00:13:24.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.308 "strip_size_kb": 0, 00:13:24.308 "state": "configuring", 00:13:24.308 "raid_level": "raid1", 00:13:24.308 "superblock": false, 00:13:24.308 "num_base_bdevs": 3, 00:13:24.308 "num_base_bdevs_discovered": 0, 00:13:24.308 "num_base_bdevs_operational": 3, 00:13:24.308 "base_bdevs_list": [ 00:13:24.308 { 00:13:24.308 
"name": "BaseBdev1", 00:13:24.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.308 "is_configured": false, 00:13:24.308 "data_offset": 0, 00:13:24.308 "data_size": 0 00:13:24.308 }, 00:13:24.308 { 00:13:24.308 "name": "BaseBdev2", 00:13:24.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.308 "is_configured": false, 00:13:24.308 "data_offset": 0, 00:13:24.308 "data_size": 0 00:13:24.308 }, 00:13:24.308 { 00:13:24.308 "name": "BaseBdev3", 00:13:24.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.308 "is_configured": false, 00:13:24.308 "data_offset": 0, 00:13:24.308 "data_size": 0 00:13:24.308 } 00:13:24.308 ] 00:13:24.308 }' 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.308 11:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.874 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:25.132 [2024-07-12 11:54:15.195722] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:25.132 [2024-07-12 11:54:15.195740] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf2c1d0 name Existed_Raid, state configuring 00:13:25.132 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:25.132 [2024-07-12 11:54:15.376197] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:25.132 [2024-07-12 11:54:15.376214] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:25.132 [2024-07-12 11:54:15.376219] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:13:25.132 [2024-07-12 11:54:15.376224] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:25.132 [2024-07-12 11:54:15.376228] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:25.132 [2024-07-12 11:54:15.376237] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:25.390 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:25.390 [2024-07-12 11:54:15.556708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:25.390 BaseBdev1 00:13:25.391 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:25.391 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:25.391 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:25.391 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:25.391 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:25.391 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:25.391 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:25.649 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:25.649 [ 00:13:25.649 { 00:13:25.649 "name": "BaseBdev1", 00:13:25.649 "aliases": [ 00:13:25.649 "5f7005a5-d774-481b-86f2-fa97454f7dd0" 
00:13:25.649 ], 00:13:25.649 "product_name": "Malloc disk", 00:13:25.649 "block_size": 512, 00:13:25.649 "num_blocks": 65536, 00:13:25.649 "uuid": "5f7005a5-d774-481b-86f2-fa97454f7dd0", 00:13:25.649 "assigned_rate_limits": { 00:13:25.649 "rw_ios_per_sec": 0, 00:13:25.649 "rw_mbytes_per_sec": 0, 00:13:25.649 "r_mbytes_per_sec": 0, 00:13:25.649 "w_mbytes_per_sec": 0 00:13:25.649 }, 00:13:25.649 "claimed": true, 00:13:25.649 "claim_type": "exclusive_write", 00:13:25.649 "zoned": false, 00:13:25.649 "supported_io_types": { 00:13:25.649 "read": true, 00:13:25.649 "write": true, 00:13:25.649 "unmap": true, 00:13:25.649 "flush": true, 00:13:25.649 "reset": true, 00:13:25.649 "nvme_admin": false, 00:13:25.649 "nvme_io": false, 00:13:25.649 "nvme_io_md": false, 00:13:25.649 "write_zeroes": true, 00:13:25.649 "zcopy": true, 00:13:25.649 "get_zone_info": false, 00:13:25.649 "zone_management": false, 00:13:25.649 "zone_append": false, 00:13:25.649 "compare": false, 00:13:25.649 "compare_and_write": false, 00:13:25.649 "abort": true, 00:13:25.649 "seek_hole": false, 00:13:25.649 "seek_data": false, 00:13:25.649 "copy": true, 00:13:25.649 "nvme_iov_md": false 00:13:25.649 }, 00:13:25.649 "memory_domains": [ 00:13:25.649 { 00:13:25.649 "dma_device_id": "system", 00:13:25.649 "dma_device_type": 1 00:13:25.649 }, 00:13:25.649 { 00:13:25.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.649 "dma_device_type": 2 00:13:25.649 } 00:13:25.649 ], 00:13:25.649 "driver_specific": {} 00:13:25.649 } 00:13:25.649 ] 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.908 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.908 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.908 "name": "Existed_Raid", 00:13:25.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.908 "strip_size_kb": 0, 00:13:25.908 "state": "configuring", 00:13:25.908 "raid_level": "raid1", 00:13:25.908 "superblock": false, 00:13:25.908 "num_base_bdevs": 3, 00:13:25.908 "num_base_bdevs_discovered": 1, 00:13:25.908 "num_base_bdevs_operational": 3, 00:13:25.908 "base_bdevs_list": [ 00:13:25.908 { 00:13:25.908 "name": "BaseBdev1", 00:13:25.908 "uuid": "5f7005a5-d774-481b-86f2-fa97454f7dd0", 00:13:25.908 "is_configured": true, 00:13:25.908 "data_offset": 0, 00:13:25.908 "data_size": 65536 00:13:25.908 }, 00:13:25.908 { 00:13:25.908 "name": "BaseBdev2", 00:13:25.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.908 "is_configured": 
false, 00:13:25.908 "data_offset": 0, 00:13:25.908 "data_size": 0 00:13:25.908 }, 00:13:25.908 { 00:13:25.908 "name": "BaseBdev3", 00:13:25.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.908 "is_configured": false, 00:13:25.908 "data_offset": 0, 00:13:25.908 "data_size": 0 00:13:25.908 } 00:13:25.908 ] 00:13:25.908 }' 00:13:25.908 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.908 11:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.475 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:26.734 [2024-07-12 11:54:16.735760] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:26.734 [2024-07-12 11:54:16.735787] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf2baa0 name Existed_Raid, state configuring 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:26.734 [2024-07-12 11:54:16.904210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:26.734 [2024-07-12 11:54:16.905246] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:26.734 [2024-07-12 11:54:16.905267] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:26.734 [2024-07-12 11:54:16.905273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:26.734 [2024-07-12 11:54:16.905278] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.734 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.993 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.993 "name": "Existed_Raid", 00:13:26.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.993 "strip_size_kb": 0, 00:13:26.993 "state": "configuring", 00:13:26.993 "raid_level": "raid1", 00:13:26.993 "superblock": false, 00:13:26.993 "num_base_bdevs": 3, 
00:13:26.993 "num_base_bdevs_discovered": 1, 00:13:26.993 "num_base_bdevs_operational": 3, 00:13:26.993 "base_bdevs_list": [ 00:13:26.993 { 00:13:26.993 "name": "BaseBdev1", 00:13:26.993 "uuid": "5f7005a5-d774-481b-86f2-fa97454f7dd0", 00:13:26.993 "is_configured": true, 00:13:26.993 "data_offset": 0, 00:13:26.993 "data_size": 65536 00:13:26.993 }, 00:13:26.993 { 00:13:26.993 "name": "BaseBdev2", 00:13:26.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.993 "is_configured": false, 00:13:26.993 "data_offset": 0, 00:13:26.993 "data_size": 0 00:13:26.993 }, 00:13:26.993 { 00:13:26.993 "name": "BaseBdev3", 00:13:26.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.993 "is_configured": false, 00:13:26.993 "data_offset": 0, 00:13:26.993 "data_size": 0 00:13:26.993 } 00:13:26.993 ] 00:13:26.993 }' 00:13:26.993 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.993 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.558 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:27.558 [2024-07-12 11:54:17.724865] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:27.558 BaseBdev2 00:13:27.558 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:27.558 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:27.558 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:27.558 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:27.558 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:27.558 11:54:17 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:27.558 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:27.816 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:28.075 [ 00:13:28.075 { 00:13:28.075 "name": "BaseBdev2", 00:13:28.075 "aliases": [ 00:13:28.075 "fe6cd267-b057-48d3-b330-466e980f3e24" 00:13:28.075 ], 00:13:28.075 "product_name": "Malloc disk", 00:13:28.075 "block_size": 512, 00:13:28.075 "num_blocks": 65536, 00:13:28.075 "uuid": "fe6cd267-b057-48d3-b330-466e980f3e24", 00:13:28.075 "assigned_rate_limits": { 00:13:28.075 "rw_ios_per_sec": 0, 00:13:28.075 "rw_mbytes_per_sec": 0, 00:13:28.075 "r_mbytes_per_sec": 0, 00:13:28.075 "w_mbytes_per_sec": 0 00:13:28.075 }, 00:13:28.075 "claimed": true, 00:13:28.075 "claim_type": "exclusive_write", 00:13:28.075 "zoned": false, 00:13:28.075 "supported_io_types": { 00:13:28.075 "read": true, 00:13:28.075 "write": true, 00:13:28.075 "unmap": true, 00:13:28.075 "flush": true, 00:13:28.075 "reset": true, 00:13:28.075 "nvme_admin": false, 00:13:28.075 "nvme_io": false, 00:13:28.075 "nvme_io_md": false, 00:13:28.075 "write_zeroes": true, 00:13:28.075 "zcopy": true, 00:13:28.075 "get_zone_info": false, 00:13:28.075 "zone_management": false, 00:13:28.075 "zone_append": false, 00:13:28.075 "compare": false, 00:13:28.075 "compare_and_write": false, 00:13:28.075 "abort": true, 00:13:28.075 "seek_hole": false, 00:13:28.075 "seek_data": false, 00:13:28.075 "copy": true, 00:13:28.075 "nvme_iov_md": false 00:13:28.075 }, 00:13:28.075 "memory_domains": [ 00:13:28.075 { 00:13:28.075 "dma_device_id": "system", 00:13:28.075 "dma_device_type": 1 00:13:28.075 }, 00:13:28.075 { 
00:13:28.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:28.075 "dma_device_type": 2 00:13:28.075 } 00:13:28.075 ], 00:13:28.075 "driver_specific": {} 00:13:28.075 } 00:13:28.075 ] 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:13:28.075 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.075 "name": "Existed_Raid", 00:13:28.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.075 "strip_size_kb": 0, 00:13:28.075 "state": "configuring", 00:13:28.075 "raid_level": "raid1", 00:13:28.075 "superblock": false, 00:13:28.075 "num_base_bdevs": 3, 00:13:28.075 "num_base_bdevs_discovered": 2, 00:13:28.075 "num_base_bdevs_operational": 3, 00:13:28.075 "base_bdevs_list": [ 00:13:28.075 { 00:13:28.075 "name": "BaseBdev1", 00:13:28.075 "uuid": "5f7005a5-d774-481b-86f2-fa97454f7dd0", 00:13:28.075 "is_configured": true, 00:13:28.075 "data_offset": 0, 00:13:28.075 "data_size": 65536 00:13:28.075 }, 00:13:28.075 { 00:13:28.075 "name": "BaseBdev2", 00:13:28.075 "uuid": "fe6cd267-b057-48d3-b330-466e980f3e24", 00:13:28.075 "is_configured": true, 00:13:28.075 "data_offset": 0, 00:13:28.075 "data_size": 65536 00:13:28.075 }, 00:13:28.075 { 00:13:28.075 "name": "BaseBdev3", 00:13:28.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.075 "is_configured": false, 00:13:28.075 "data_offset": 0, 00:13:28.075 "data_size": 0 00:13:28.075 } 00:13:28.075 ] 00:13:28.075 }' 00:13:28.076 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.076 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.642 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:28.900 [2024-07-12 11:54:18.898559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:28.900 [2024-07-12 11:54:18.898586] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf2c990 00:13:28.900 [2024-07-12 11:54:18.898590] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:13:28.900 [2024-07-12 11:54:18.898714] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf43880 00:13:28.900 [2024-07-12 11:54:18.898798] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf2c990 00:13:28.900 [2024-07-12 11:54:18.898803] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf2c990 00:13:28.900 [2024-07-12 11:54:18.898912] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:28.900 BaseBdev3 00:13:28.900 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:28.900 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:28.900 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:28.900 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:28.900 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:28.900 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:28.900 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:28.900 11:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:29.158 [ 00:13:29.158 { 00:13:29.158 "name": "BaseBdev3", 00:13:29.158 "aliases": [ 00:13:29.158 "5be80140-1281-45b4-83ab-f7057bdbc895" 00:13:29.158 ], 00:13:29.158 "product_name": "Malloc disk", 00:13:29.158 "block_size": 512, 00:13:29.158 "num_blocks": 65536, 00:13:29.158 "uuid": "5be80140-1281-45b4-83ab-f7057bdbc895", 00:13:29.158 "assigned_rate_limits": { 
00:13:29.158 "rw_ios_per_sec": 0, 00:13:29.158 "rw_mbytes_per_sec": 0, 00:13:29.158 "r_mbytes_per_sec": 0, 00:13:29.158 "w_mbytes_per_sec": 0 00:13:29.158 }, 00:13:29.158 "claimed": true, 00:13:29.158 "claim_type": "exclusive_write", 00:13:29.158 "zoned": false, 00:13:29.158 "supported_io_types": { 00:13:29.158 "read": true, 00:13:29.158 "write": true, 00:13:29.158 "unmap": true, 00:13:29.158 "flush": true, 00:13:29.158 "reset": true, 00:13:29.158 "nvme_admin": false, 00:13:29.158 "nvme_io": false, 00:13:29.158 "nvme_io_md": false, 00:13:29.158 "write_zeroes": true, 00:13:29.158 "zcopy": true, 00:13:29.158 "get_zone_info": false, 00:13:29.158 "zone_management": false, 00:13:29.158 "zone_append": false, 00:13:29.158 "compare": false, 00:13:29.158 "compare_and_write": false, 00:13:29.158 "abort": true, 00:13:29.158 "seek_hole": false, 00:13:29.158 "seek_data": false, 00:13:29.158 "copy": true, 00:13:29.158 "nvme_iov_md": false 00:13:29.158 }, 00:13:29.158 "memory_domains": [ 00:13:29.158 { 00:13:29.158 "dma_device_id": "system", 00:13:29.158 "dma_device_type": 1 00:13:29.158 }, 00:13:29.158 { 00:13:29.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.158 "dma_device_type": 2 00:13:29.158 } 00:13:29.158 ], 00:13:29.158 "driver_specific": {} 00:13:29.158 } 00:13:29.158 ] 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:29.158 
11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.158 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:29.415 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.415 "name": "Existed_Raid", 00:13:29.416 "uuid": "9f5672a1-1661-4461-b406-8c9ecdbd2d78", 00:13:29.416 "strip_size_kb": 0, 00:13:29.416 "state": "online", 00:13:29.416 "raid_level": "raid1", 00:13:29.416 "superblock": false, 00:13:29.416 "num_base_bdevs": 3, 00:13:29.416 "num_base_bdevs_discovered": 3, 00:13:29.416 "num_base_bdevs_operational": 3, 00:13:29.416 "base_bdevs_list": [ 00:13:29.416 { 00:13:29.416 "name": "BaseBdev1", 00:13:29.416 "uuid": "5f7005a5-d774-481b-86f2-fa97454f7dd0", 00:13:29.416 "is_configured": true, 00:13:29.416 "data_offset": 0, 00:13:29.416 "data_size": 65536 00:13:29.416 }, 00:13:29.416 { 00:13:29.416 "name": "BaseBdev2", 00:13:29.416 "uuid": "fe6cd267-b057-48d3-b330-466e980f3e24", 00:13:29.416 "is_configured": true, 00:13:29.416 "data_offset": 0, 
00:13:29.416 "data_size": 65536 00:13:29.416 }, 00:13:29.416 { 00:13:29.416 "name": "BaseBdev3", 00:13:29.416 "uuid": "5be80140-1281-45b4-83ab-f7057bdbc895", 00:13:29.416 "is_configured": true, 00:13:29.416 "data_offset": 0, 00:13:29.416 "data_size": 65536 00:13:29.416 } 00:13:29.416 ] 00:13:29.416 }' 00:13:29.416 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.416 11:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.982 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:29.982 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:29.982 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:29.982 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:29.982 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:29.982 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:29.982 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:29.982 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:29.982 [2024-07-12 11:54:20.093849] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:29.982 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:29.982 "name": "Existed_Raid", 00:13:29.982 "aliases": [ 00:13:29.982 "9f5672a1-1661-4461-b406-8c9ecdbd2d78" 00:13:29.982 ], 00:13:29.982 "product_name": "Raid Volume", 00:13:29.982 "block_size": 512, 00:13:29.982 "num_blocks": 65536, 00:13:29.982 "uuid": 
"9f5672a1-1661-4461-b406-8c9ecdbd2d78", 00:13:29.982 "assigned_rate_limits": { 00:13:29.982 "rw_ios_per_sec": 0, 00:13:29.982 "rw_mbytes_per_sec": 0, 00:13:29.982 "r_mbytes_per_sec": 0, 00:13:29.982 "w_mbytes_per_sec": 0 00:13:29.982 }, 00:13:29.982 "claimed": false, 00:13:29.982 "zoned": false, 00:13:29.982 "supported_io_types": { 00:13:29.982 "read": true, 00:13:29.982 "write": true, 00:13:29.982 "unmap": false, 00:13:29.982 "flush": false, 00:13:29.982 "reset": true, 00:13:29.982 "nvme_admin": false, 00:13:29.982 "nvme_io": false, 00:13:29.982 "nvme_io_md": false, 00:13:29.982 "write_zeroes": true, 00:13:29.982 "zcopy": false, 00:13:29.982 "get_zone_info": false, 00:13:29.982 "zone_management": false, 00:13:29.982 "zone_append": false, 00:13:29.982 "compare": false, 00:13:29.982 "compare_and_write": false, 00:13:29.982 "abort": false, 00:13:29.982 "seek_hole": false, 00:13:29.982 "seek_data": false, 00:13:29.982 "copy": false, 00:13:29.982 "nvme_iov_md": false 00:13:29.982 }, 00:13:29.982 "memory_domains": [ 00:13:29.982 { 00:13:29.982 "dma_device_id": "system", 00:13:29.982 "dma_device_type": 1 00:13:29.982 }, 00:13:29.982 { 00:13:29.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.982 "dma_device_type": 2 00:13:29.982 }, 00:13:29.982 { 00:13:29.982 "dma_device_id": "system", 00:13:29.982 "dma_device_type": 1 00:13:29.982 }, 00:13:29.982 { 00:13:29.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.982 "dma_device_type": 2 00:13:29.982 }, 00:13:29.982 { 00:13:29.982 "dma_device_id": "system", 00:13:29.982 "dma_device_type": 1 00:13:29.982 }, 00:13:29.982 { 00:13:29.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.982 "dma_device_type": 2 00:13:29.982 } 00:13:29.982 ], 00:13:29.982 "driver_specific": { 00:13:29.982 "raid": { 00:13:29.982 "uuid": "9f5672a1-1661-4461-b406-8c9ecdbd2d78", 00:13:29.982 "strip_size_kb": 0, 00:13:29.982 "state": "online", 00:13:29.982 "raid_level": "raid1", 00:13:29.982 "superblock": false, 00:13:29.982 
"num_base_bdevs": 3, 00:13:29.982 "num_base_bdevs_discovered": 3, 00:13:29.982 "num_base_bdevs_operational": 3, 00:13:29.982 "base_bdevs_list": [ 00:13:29.982 { 00:13:29.982 "name": "BaseBdev1", 00:13:29.982 "uuid": "5f7005a5-d774-481b-86f2-fa97454f7dd0", 00:13:29.982 "is_configured": true, 00:13:29.982 "data_offset": 0, 00:13:29.982 "data_size": 65536 00:13:29.982 }, 00:13:29.982 { 00:13:29.982 "name": "BaseBdev2", 00:13:29.982 "uuid": "fe6cd267-b057-48d3-b330-466e980f3e24", 00:13:29.982 "is_configured": true, 00:13:29.982 "data_offset": 0, 00:13:29.982 "data_size": 65536 00:13:29.982 }, 00:13:29.982 { 00:13:29.982 "name": "BaseBdev3", 00:13:29.982 "uuid": "5be80140-1281-45b4-83ab-f7057bdbc895", 00:13:29.982 "is_configured": true, 00:13:29.982 "data_offset": 0, 00:13:29.982 "data_size": 65536 00:13:29.982 } 00:13:29.982 ] 00:13:29.982 } 00:13:29.982 } 00:13:29.982 }' 00:13:29.982 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:29.982 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:29.982 BaseBdev2 00:13:29.982 BaseBdev3' 00:13:29.982 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:29.982 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:29.982 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:30.241 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:30.241 "name": "BaseBdev1", 00:13:30.241 "aliases": [ 00:13:30.241 "5f7005a5-d774-481b-86f2-fa97454f7dd0" 00:13:30.241 ], 00:13:30.241 "product_name": "Malloc disk", 00:13:30.241 "block_size": 512, 00:13:30.241 "num_blocks": 65536, 00:13:30.241 "uuid": 
"5f7005a5-d774-481b-86f2-fa97454f7dd0", 00:13:30.241 "assigned_rate_limits": { 00:13:30.241 "rw_ios_per_sec": 0, 00:13:30.241 "rw_mbytes_per_sec": 0, 00:13:30.241 "r_mbytes_per_sec": 0, 00:13:30.241 "w_mbytes_per_sec": 0 00:13:30.241 }, 00:13:30.241 "claimed": true, 00:13:30.241 "claim_type": "exclusive_write", 00:13:30.241 "zoned": false, 00:13:30.241 "supported_io_types": { 00:13:30.241 "read": true, 00:13:30.241 "write": true, 00:13:30.241 "unmap": true, 00:13:30.241 "flush": true, 00:13:30.241 "reset": true, 00:13:30.241 "nvme_admin": false, 00:13:30.241 "nvme_io": false, 00:13:30.241 "nvme_io_md": false, 00:13:30.241 "write_zeroes": true, 00:13:30.241 "zcopy": true, 00:13:30.241 "get_zone_info": false, 00:13:30.241 "zone_management": false, 00:13:30.241 "zone_append": false, 00:13:30.241 "compare": false, 00:13:30.241 "compare_and_write": false, 00:13:30.241 "abort": true, 00:13:30.241 "seek_hole": false, 00:13:30.241 "seek_data": false, 00:13:30.241 "copy": true, 00:13:30.241 "nvme_iov_md": false 00:13:30.241 }, 00:13:30.241 "memory_domains": [ 00:13:30.241 { 00:13:30.241 "dma_device_id": "system", 00:13:30.241 "dma_device_type": 1 00:13:30.241 }, 00:13:30.241 { 00:13:30.241 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.241 "dma_device_type": 2 00:13:30.241 } 00:13:30.241 ], 00:13:30.241 "driver_specific": {} 00:13:30.241 }' 00:13:30.241 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.241 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.241 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:30.241 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.241 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.500 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:30.500 11:54:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.500 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.500 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:30.500 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.500 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.500 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:30.500 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.500 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:30.500 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:30.759 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:30.759 "name": "BaseBdev2", 00:13:30.759 "aliases": [ 00:13:30.759 "fe6cd267-b057-48d3-b330-466e980f3e24" 00:13:30.759 ], 00:13:30.759 "product_name": "Malloc disk", 00:13:30.759 "block_size": 512, 00:13:30.759 "num_blocks": 65536, 00:13:30.759 "uuid": "fe6cd267-b057-48d3-b330-466e980f3e24", 00:13:30.759 "assigned_rate_limits": { 00:13:30.759 "rw_ios_per_sec": 0, 00:13:30.759 "rw_mbytes_per_sec": 0, 00:13:30.759 "r_mbytes_per_sec": 0, 00:13:30.759 "w_mbytes_per_sec": 0 00:13:30.759 }, 00:13:30.759 "claimed": true, 00:13:30.759 "claim_type": "exclusive_write", 00:13:30.759 "zoned": false, 00:13:30.759 "supported_io_types": { 00:13:30.759 "read": true, 00:13:30.759 "write": true, 00:13:30.759 "unmap": true, 00:13:30.759 "flush": true, 00:13:30.759 "reset": true, 00:13:30.759 "nvme_admin": false, 00:13:30.759 "nvme_io": false, 00:13:30.759 "nvme_io_md": false, 
00:13:30.759 "write_zeroes": true, 00:13:30.759 "zcopy": true, 00:13:30.759 "get_zone_info": false, 00:13:30.759 "zone_management": false, 00:13:30.759 "zone_append": false, 00:13:30.759 "compare": false, 00:13:30.759 "compare_and_write": false, 00:13:30.759 "abort": true, 00:13:30.759 "seek_hole": false, 00:13:30.759 "seek_data": false, 00:13:30.759 "copy": true, 00:13:30.759 "nvme_iov_md": false 00:13:30.759 }, 00:13:30.759 "memory_domains": [ 00:13:30.759 { 00:13:30.759 "dma_device_id": "system", 00:13:30.759 "dma_device_type": 1 00:13:30.759 }, 00:13:30.759 { 00:13:30.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.759 "dma_device_type": 2 00:13:30.759 } 00:13:30.759 ], 00:13:30.759 "driver_specific": {} 00:13:30.759 }' 00:13:30.759 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.759 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.759 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:30.759 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.759 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.759 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:30.759 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.759 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.018 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:31.018 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.018 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.018 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:31.018 11:54:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:31.018 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:31.018 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:31.276 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:31.276 "name": "BaseBdev3", 00:13:31.276 "aliases": [ 00:13:31.276 "5be80140-1281-45b4-83ab-f7057bdbc895" 00:13:31.276 ], 00:13:31.276 "product_name": "Malloc disk", 00:13:31.276 "block_size": 512, 00:13:31.276 "num_blocks": 65536, 00:13:31.276 "uuid": "5be80140-1281-45b4-83ab-f7057bdbc895", 00:13:31.276 "assigned_rate_limits": { 00:13:31.276 "rw_ios_per_sec": 0, 00:13:31.276 "rw_mbytes_per_sec": 0, 00:13:31.276 "r_mbytes_per_sec": 0, 00:13:31.276 "w_mbytes_per_sec": 0 00:13:31.276 }, 00:13:31.276 "claimed": true, 00:13:31.276 "claim_type": "exclusive_write", 00:13:31.276 "zoned": false, 00:13:31.276 "supported_io_types": { 00:13:31.276 "read": true, 00:13:31.276 "write": true, 00:13:31.276 "unmap": true, 00:13:31.276 "flush": true, 00:13:31.276 "reset": true, 00:13:31.276 "nvme_admin": false, 00:13:31.276 "nvme_io": false, 00:13:31.276 "nvme_io_md": false, 00:13:31.276 "write_zeroes": true, 00:13:31.276 "zcopy": true, 00:13:31.276 "get_zone_info": false, 00:13:31.276 "zone_management": false, 00:13:31.276 "zone_append": false, 00:13:31.276 "compare": false, 00:13:31.276 "compare_and_write": false, 00:13:31.276 "abort": true, 00:13:31.276 "seek_hole": false, 00:13:31.276 "seek_data": false, 00:13:31.276 "copy": true, 00:13:31.276 "nvme_iov_md": false 00:13:31.276 }, 00:13:31.276 "memory_domains": [ 00:13:31.276 { 00:13:31.276 "dma_device_id": "system", 00:13:31.276 "dma_device_type": 1 00:13:31.276 }, 00:13:31.276 { 00:13:31.276 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:31.276 "dma_device_type": 2 00:13:31.276 } 00:13:31.276 ], 00:13:31.276 "driver_specific": {} 00:13:31.276 }' 00:13:31.276 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.276 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.276 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:31.276 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.276 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.276 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:31.276 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.276 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:31.535 [2024-07-12 11:54:21.758023] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:31.535 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.794 "name": "Existed_Raid", 00:13:31.794 "uuid": "9f5672a1-1661-4461-b406-8c9ecdbd2d78", 00:13:31.794 "strip_size_kb": 0, 00:13:31.794 "state": "online", 00:13:31.794 "raid_level": "raid1", 
00:13:31.794 "superblock": false, 00:13:31.794 "num_base_bdevs": 3, 00:13:31.794 "num_base_bdevs_discovered": 2, 00:13:31.794 "num_base_bdevs_operational": 2, 00:13:31.794 "base_bdevs_list": [ 00:13:31.794 { 00:13:31.794 "name": null, 00:13:31.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.794 "is_configured": false, 00:13:31.794 "data_offset": 0, 00:13:31.794 "data_size": 65536 00:13:31.794 }, 00:13:31.794 { 00:13:31.794 "name": "BaseBdev2", 00:13:31.794 "uuid": "fe6cd267-b057-48d3-b330-466e980f3e24", 00:13:31.794 "is_configured": true, 00:13:31.794 "data_offset": 0, 00:13:31.794 "data_size": 65536 00:13:31.794 }, 00:13:31.794 { 00:13:31.794 "name": "BaseBdev3", 00:13:31.794 "uuid": "5be80140-1281-45b4-83ab-f7057bdbc895", 00:13:31.794 "is_configured": true, 00:13:31.794 "data_offset": 0, 00:13:31.794 "data_size": 65536 00:13:31.794 } 00:13:31.794 ] 00:13:31.794 }' 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.794 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.480 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:32.480 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:32.480 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.480 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:32.480 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:32.480 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:32.480 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:32.742 [2024-07-12 11:54:22.785445] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:32.742 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:32.742 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:32.742 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.742 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:32.742 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:32.742 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:32.742 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:33.003 [2024-07-12 11:54:23.135979] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:33.003 [2024-07-12 11:54:23.136034] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:33.003 [2024-07-12 11:54:23.145958] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:33.003 [2024-07-12 11:54:23.145982] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:33.003 [2024-07-12 11:54:23.145988] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf2c990 name Existed_Raid, state offline 00:13:33.003 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:33.003 11:54:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:33.003 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.003 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:33.270 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:33.270 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:33.270 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:33.270 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:33.270 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:33.270 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:33.270 BaseBdev2 00:13:33.545 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:33.545 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:33.545 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:33.545 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:33.545 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:33.545 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:33.545 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:33.545 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:33.818 [ 00:13:33.818 { 00:13:33.818 "name": "BaseBdev2", 00:13:33.818 "aliases": [ 00:13:33.818 "c7a77523-52bf-4ca2-9f2a-05c26088ad9c" 00:13:33.818 ], 00:13:33.818 "product_name": "Malloc disk", 00:13:33.818 "block_size": 512, 00:13:33.818 "num_blocks": 65536, 00:13:33.818 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:33.818 "assigned_rate_limits": { 00:13:33.818 "rw_ios_per_sec": 0, 00:13:33.818 "rw_mbytes_per_sec": 0, 00:13:33.818 "r_mbytes_per_sec": 0, 00:13:33.818 "w_mbytes_per_sec": 0 00:13:33.818 }, 00:13:33.818 "claimed": false, 00:13:33.818 "zoned": false, 00:13:33.818 "supported_io_types": { 00:13:33.818 "read": true, 00:13:33.818 "write": true, 00:13:33.818 "unmap": true, 00:13:33.818 "flush": true, 00:13:33.818 "reset": true, 00:13:33.818 "nvme_admin": false, 00:13:33.818 "nvme_io": false, 00:13:33.818 "nvme_io_md": false, 00:13:33.818 "write_zeroes": true, 00:13:33.818 "zcopy": true, 00:13:33.818 "get_zone_info": false, 00:13:33.818 "zone_management": false, 00:13:33.818 "zone_append": false, 00:13:33.818 "compare": false, 00:13:33.818 "compare_and_write": false, 00:13:33.818 "abort": true, 00:13:33.818 "seek_hole": false, 00:13:33.818 "seek_data": false, 00:13:33.818 "copy": true, 00:13:33.818 "nvme_iov_md": false 00:13:33.818 }, 00:13:33.818 "memory_domains": [ 00:13:33.818 { 00:13:33.818 "dma_device_id": "system", 00:13:33.818 "dma_device_type": 1 00:13:33.818 }, 00:13:33.818 { 00:13:33.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.818 "dma_device_type": 2 00:13:33.818 } 00:13:33.818 ], 00:13:33.818 "driver_specific": {} 00:13:33.818 } 00:13:33.818 ] 00:13:33.818 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:33.818 
11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:33.818 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:33.818 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:33.818 BaseBdev3 00:13:33.818 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:33.818 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:33.818 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:33.818 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:33.818 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:33.818 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:33.818 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:34.076 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:34.334 [ 00:13:34.334 { 00:13:34.334 "name": "BaseBdev3", 00:13:34.334 "aliases": [ 00:13:34.334 "d9a8c363-ae86-42f9-938b-993c67e16bf6" 00:13:34.334 ], 00:13:34.334 "product_name": "Malloc disk", 00:13:34.334 "block_size": 512, 00:13:34.334 "num_blocks": 65536, 00:13:34.334 "uuid": "d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:34.334 "assigned_rate_limits": { 00:13:34.334 "rw_ios_per_sec": 0, 00:13:34.334 "rw_mbytes_per_sec": 0, 00:13:34.334 
"r_mbytes_per_sec": 0, 00:13:34.334 "w_mbytes_per_sec": 0 00:13:34.334 }, 00:13:34.334 "claimed": false, 00:13:34.334 "zoned": false, 00:13:34.334 "supported_io_types": { 00:13:34.334 "read": true, 00:13:34.334 "write": true, 00:13:34.334 "unmap": true, 00:13:34.334 "flush": true, 00:13:34.334 "reset": true, 00:13:34.334 "nvme_admin": false, 00:13:34.334 "nvme_io": false, 00:13:34.334 "nvme_io_md": false, 00:13:34.334 "write_zeroes": true, 00:13:34.334 "zcopy": true, 00:13:34.334 "get_zone_info": false, 00:13:34.334 "zone_management": false, 00:13:34.334 "zone_append": false, 00:13:34.334 "compare": false, 00:13:34.334 "compare_and_write": false, 00:13:34.334 "abort": true, 00:13:34.334 "seek_hole": false, 00:13:34.334 "seek_data": false, 00:13:34.334 "copy": true, 00:13:34.334 "nvme_iov_md": false 00:13:34.334 }, 00:13:34.334 "memory_domains": [ 00:13:34.334 { 00:13:34.334 "dma_device_id": "system", 00:13:34.334 "dma_device_type": 1 00:13:34.334 }, 00:13:34.334 { 00:13:34.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:34.334 "dma_device_type": 2 00:13:34.334 } 00:13:34.334 ], 00:13:34.334 "driver_specific": {} 00:13:34.334 } 00:13:34.334 ] 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:34.334 [2024-07-12 11:54:24.492808] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:34.334 [2024-07-12 11:54:24.492833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:13:34.334 [2024-07-12 11:54:24.492843] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:34.334 [2024-07-12 11:54:24.493642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:34.334 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.335 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.335 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.335 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.335 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.335 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.335 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:34.593 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.593 "name": "Existed_Raid", 00:13:34.593 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.593 "strip_size_kb": 0, 00:13:34.593 "state": 
"configuring", 00:13:34.593 "raid_level": "raid1", 00:13:34.593 "superblock": false, 00:13:34.593 "num_base_bdevs": 3, 00:13:34.593 "num_base_bdevs_discovered": 2, 00:13:34.593 "num_base_bdevs_operational": 3, 00:13:34.593 "base_bdevs_list": [ 00:13:34.593 { 00:13:34.593 "name": "BaseBdev1", 00:13:34.593 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.593 "is_configured": false, 00:13:34.593 "data_offset": 0, 00:13:34.593 "data_size": 0 00:13:34.593 }, 00:13:34.593 { 00:13:34.593 "name": "BaseBdev2", 00:13:34.593 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:34.593 "is_configured": true, 00:13:34.593 "data_offset": 0, 00:13:34.593 "data_size": 65536 00:13:34.593 }, 00:13:34.593 { 00:13:34.593 "name": "BaseBdev3", 00:13:34.593 "uuid": "d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:34.593 "is_configured": true, 00:13:34.593 "data_offset": 0, 00:13:34.593 "data_size": 65536 00:13:34.593 } 00:13:34.593 ] 00:13:34.593 }' 00:13:34.593 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.593 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:35.160 [2024-07-12 11:54:25.318910] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:35.160 11:54:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.160 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:35.418 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.418 "name": "Existed_Raid", 00:13:35.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.418 "strip_size_kb": 0, 00:13:35.419 "state": "configuring", 00:13:35.419 "raid_level": "raid1", 00:13:35.419 "superblock": false, 00:13:35.419 "num_base_bdevs": 3, 00:13:35.419 "num_base_bdevs_discovered": 1, 00:13:35.419 "num_base_bdevs_operational": 3, 00:13:35.419 "base_bdevs_list": [ 00:13:35.419 { 00:13:35.419 "name": "BaseBdev1", 00:13:35.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.419 "is_configured": false, 00:13:35.419 "data_offset": 0, 00:13:35.419 "data_size": 0 00:13:35.419 }, 00:13:35.419 { 00:13:35.419 "name": null, 00:13:35.419 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:35.419 "is_configured": false, 00:13:35.419 "data_offset": 0, 00:13:35.419 "data_size": 65536 00:13:35.419 }, 00:13:35.419 { 00:13:35.419 "name": "BaseBdev3", 00:13:35.419 "uuid": 
"d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:35.419 "is_configured": true, 00:13:35.419 "data_offset": 0, 00:13:35.419 "data_size": 65536 00:13:35.419 } 00:13:35.419 ] 00:13:35.419 }' 00:13:35.419 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.419 11:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.986 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.986 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:35.987 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:35.987 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:36.246 [2024-07-12 11:54:26.324252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:36.246 BaseBdev1 00:13:36.246 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:36.246 11:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:36.246 11:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:36.246 11:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:36.246 11:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:36.246 11:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:36.246 11:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:36.505 [ 00:13:36.505 { 00:13:36.505 "name": "BaseBdev1", 00:13:36.505 "aliases": [ 00:13:36.505 "2aab7478-55e0-459a-b341-d227a82fcc53" 00:13:36.505 ], 00:13:36.505 "product_name": "Malloc disk", 00:13:36.505 "block_size": 512, 00:13:36.505 "num_blocks": 65536, 00:13:36.505 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:36.505 "assigned_rate_limits": { 00:13:36.505 "rw_ios_per_sec": 0, 00:13:36.505 "rw_mbytes_per_sec": 0, 00:13:36.505 "r_mbytes_per_sec": 0, 00:13:36.505 "w_mbytes_per_sec": 0 00:13:36.505 }, 00:13:36.505 "claimed": true, 00:13:36.505 "claim_type": "exclusive_write", 00:13:36.505 "zoned": false, 00:13:36.505 "supported_io_types": { 00:13:36.505 "read": true, 00:13:36.505 "write": true, 00:13:36.505 "unmap": true, 00:13:36.505 "flush": true, 00:13:36.505 "reset": true, 00:13:36.505 "nvme_admin": false, 00:13:36.505 "nvme_io": false, 00:13:36.505 "nvme_io_md": false, 00:13:36.505 "write_zeroes": true, 00:13:36.505 "zcopy": true, 00:13:36.505 "get_zone_info": false, 00:13:36.505 "zone_management": false, 00:13:36.505 "zone_append": false, 00:13:36.505 "compare": false, 00:13:36.505 "compare_and_write": false, 00:13:36.505 "abort": true, 00:13:36.505 "seek_hole": false, 00:13:36.505 "seek_data": false, 00:13:36.505 "copy": true, 00:13:36.505 "nvme_iov_md": false 00:13:36.505 }, 00:13:36.505 "memory_domains": [ 00:13:36.505 { 00:13:36.505 "dma_device_id": "system", 00:13:36.505 "dma_device_type": 1 00:13:36.505 }, 00:13:36.505 { 00:13:36.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.505 "dma_device_type": 2 00:13:36.505 } 00:13:36.505 ], 00:13:36.505 "driver_specific": {} 00:13:36.505 } 00:13:36.505 ] 
00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.505 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:36.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.765 "name": "Existed_Raid", 00:13:36.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.765 "strip_size_kb": 0, 00:13:36.765 "state": "configuring", 00:13:36.765 "raid_level": "raid1", 00:13:36.765 "superblock": false, 00:13:36.765 "num_base_bdevs": 3, 00:13:36.765 
"num_base_bdevs_discovered": 2, 00:13:36.765 "num_base_bdevs_operational": 3, 00:13:36.765 "base_bdevs_list": [ 00:13:36.765 { 00:13:36.765 "name": "BaseBdev1", 00:13:36.765 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:36.765 "is_configured": true, 00:13:36.765 "data_offset": 0, 00:13:36.765 "data_size": 65536 00:13:36.765 }, 00:13:36.765 { 00:13:36.765 "name": null, 00:13:36.765 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:36.765 "is_configured": false, 00:13:36.765 "data_offset": 0, 00:13:36.765 "data_size": 65536 00:13:36.765 }, 00:13:36.765 { 00:13:36.765 "name": "BaseBdev3", 00:13:36.765 "uuid": "d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:36.765 "is_configured": true, 00:13:36.765 "data_offset": 0, 00:13:36.765 "data_size": 65536 00:13:36.765 } 00:13:36.765 ] 00:13:36.765 }' 00:13:36.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.765 11:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.333 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:37.333 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.333 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:37.333 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:37.592 [2024-07-12 11:54:27.615739] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.592 "name": "Existed_Raid", 00:13:37.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.592 "strip_size_kb": 0, 00:13:37.592 "state": "configuring", 00:13:37.592 "raid_level": "raid1", 00:13:37.592 "superblock": false, 00:13:37.592 "num_base_bdevs": 3, 00:13:37.592 "num_base_bdevs_discovered": 1, 00:13:37.592 "num_base_bdevs_operational": 3, 00:13:37.592 "base_bdevs_list": [ 00:13:37.592 { 00:13:37.592 "name": "BaseBdev1", 00:13:37.592 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:37.592 "is_configured": true, 00:13:37.592 "data_offset": 0, 00:13:37.592 "data_size": 65536 
00:13:37.592 }, 00:13:37.592 { 00:13:37.592 "name": null, 00:13:37.592 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:37.592 "is_configured": false, 00:13:37.592 "data_offset": 0, 00:13:37.592 "data_size": 65536 00:13:37.592 }, 00:13:37.592 { 00:13:37.592 "name": null, 00:13:37.592 "uuid": "d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:37.592 "is_configured": false, 00:13:37.592 "data_offset": 0, 00:13:37.592 "data_size": 65536 00:13:37.592 } 00:13:37.592 ] 00:13:37.592 }' 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.592 11:54:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.159 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:38.159 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:38.419 [2024-07-12 11:54:28.610326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:38.419 11:54:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.419 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:38.679 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.679 "name": "Existed_Raid", 00:13:38.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.679 "strip_size_kb": 0, 00:13:38.679 "state": "configuring", 00:13:38.679 "raid_level": "raid1", 00:13:38.679 "superblock": false, 00:13:38.679 "num_base_bdevs": 3, 00:13:38.679 "num_base_bdevs_discovered": 2, 00:13:38.679 "num_base_bdevs_operational": 3, 00:13:38.679 "base_bdevs_list": [ 00:13:38.679 { 00:13:38.679 "name": "BaseBdev1", 00:13:38.679 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:38.679 "is_configured": true, 00:13:38.679 "data_offset": 0, 00:13:38.679 "data_size": 65536 00:13:38.679 }, 00:13:38.679 { 00:13:38.679 "name": null, 00:13:38.679 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:38.679 "is_configured": false, 00:13:38.679 "data_offset": 0, 00:13:38.679 "data_size": 65536 00:13:38.679 }, 00:13:38.679 { 00:13:38.679 "name": "BaseBdev3", 00:13:38.679 "uuid": 
"d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:38.679 "is_configured": true, 00:13:38.679 "data_offset": 0, 00:13:38.679 "data_size": 65536 00:13:38.679 } 00:13:38.679 ] 00:13:38.679 }' 00:13:38.679 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.679 11:54:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.248 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.248 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:39.248 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:39.248 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:39.507 [2024-07-12 11:54:29.576842] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:39.507 11:54:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.507 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:39.765 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:39.765 "name": "Existed_Raid", 00:13:39.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:39.765 "strip_size_kb": 0, 00:13:39.765 "state": "configuring", 00:13:39.765 "raid_level": "raid1", 00:13:39.765 "superblock": false, 00:13:39.765 "num_base_bdevs": 3, 00:13:39.765 "num_base_bdevs_discovered": 1, 00:13:39.765 "num_base_bdevs_operational": 3, 00:13:39.765 "base_bdevs_list": [ 00:13:39.765 { 00:13:39.765 "name": null, 00:13:39.765 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:39.765 "is_configured": false, 00:13:39.765 "data_offset": 0, 00:13:39.765 "data_size": 65536 00:13:39.765 }, 00:13:39.765 { 00:13:39.765 "name": null, 00:13:39.765 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:39.765 "is_configured": false, 00:13:39.765 "data_offset": 0, 00:13:39.765 "data_size": 65536 00:13:39.765 }, 00:13:39.765 { 00:13:39.765 "name": "BaseBdev3", 00:13:39.765 "uuid": "d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:39.765 "is_configured": true, 00:13:39.765 "data_offset": 0, 00:13:39.765 "data_size": 65536 00:13:39.765 } 00:13:39.765 ] 00:13:39.765 }' 00:13:39.765 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:39.765 11:54:29 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:13:40.331 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:40.331 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.331 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:40.331 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:40.589 [2024-07-12 11:54:30.593210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:40.589 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.589 "name": "Existed_Raid", 00:13:40.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.589 "strip_size_kb": 0, 00:13:40.589 "state": "configuring", 00:13:40.589 "raid_level": "raid1", 00:13:40.589 "superblock": false, 00:13:40.589 "num_base_bdevs": 3, 00:13:40.589 "num_base_bdevs_discovered": 2, 00:13:40.589 "num_base_bdevs_operational": 3, 00:13:40.589 "base_bdevs_list": [ 00:13:40.589 { 00:13:40.589 "name": null, 00:13:40.589 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:40.589 "is_configured": false, 00:13:40.589 "data_offset": 0, 00:13:40.589 "data_size": 65536 00:13:40.589 }, 00:13:40.589 { 00:13:40.589 "name": "BaseBdev2", 00:13:40.590 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:40.590 "is_configured": true, 00:13:40.590 "data_offset": 0, 00:13:40.590 "data_size": 65536 00:13:40.590 }, 00:13:40.590 { 00:13:40.590 "name": "BaseBdev3", 00:13:40.590 "uuid": "d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:40.590 "is_configured": true, 00:13:40.590 "data_offset": 0, 00:13:40.590 "data_size": 65536 00:13:40.590 } 00:13:40.590 ] 00:13:40.590 }' 00:13:40.590 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.590 11:54:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.190 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.190 11:54:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:41.448 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:41.448 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:41.448 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.448 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2aab7478-55e0-459a-b341-d227a82fcc53 00:13:41.707 [2024-07-12 11:54:31.766989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:41.707 [2024-07-12 11:54:31.767019] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf243d0 00:13:41.707 [2024-07-12 11:54:31.767023] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:41.707 [2024-07-12 11:54:31.767150] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10e08e0 00:13:41.707 [2024-07-12 11:54:31.767235] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf243d0 00:13:41.707 [2024-07-12 11:54:31.767240] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf243d0 00:13:41.707 [2024-07-12 11:54:31.767350] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:41.707 NewBaseBdev 00:13:41.707 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:41.707 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:41.707 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # 
local bdev_timeout= 00:13:41.707 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:41.707 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:41.707 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:41.707 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:41.707 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:41.964 [ 00:13:41.964 { 00:13:41.964 "name": "NewBaseBdev", 00:13:41.964 "aliases": [ 00:13:41.964 "2aab7478-55e0-459a-b341-d227a82fcc53" 00:13:41.964 ], 00:13:41.964 "product_name": "Malloc disk", 00:13:41.964 "block_size": 512, 00:13:41.964 "num_blocks": 65536, 00:13:41.964 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:41.964 "assigned_rate_limits": { 00:13:41.964 "rw_ios_per_sec": 0, 00:13:41.964 "rw_mbytes_per_sec": 0, 00:13:41.964 "r_mbytes_per_sec": 0, 00:13:41.964 "w_mbytes_per_sec": 0 00:13:41.964 }, 00:13:41.964 "claimed": true, 00:13:41.964 "claim_type": "exclusive_write", 00:13:41.964 "zoned": false, 00:13:41.964 "supported_io_types": { 00:13:41.964 "read": true, 00:13:41.964 "write": true, 00:13:41.964 "unmap": true, 00:13:41.964 "flush": true, 00:13:41.964 "reset": true, 00:13:41.964 "nvme_admin": false, 00:13:41.964 "nvme_io": false, 00:13:41.964 "nvme_io_md": false, 00:13:41.964 "write_zeroes": true, 00:13:41.964 "zcopy": true, 00:13:41.964 "get_zone_info": false, 00:13:41.964 "zone_management": false, 00:13:41.964 "zone_append": false, 00:13:41.964 "compare": false, 00:13:41.964 "compare_and_write": false, 00:13:41.964 "abort": true, 00:13:41.964 "seek_hole": false, 
00:13:41.964 "seek_data": false, 00:13:41.964 "copy": true, 00:13:41.964 "nvme_iov_md": false 00:13:41.964 }, 00:13:41.964 "memory_domains": [ 00:13:41.964 { 00:13:41.964 "dma_device_id": "system", 00:13:41.964 "dma_device_type": 1 00:13:41.964 }, 00:13:41.964 { 00:13:41.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.964 "dma_device_type": 2 00:13:41.964 } 00:13:41.964 ], 00:13:41.964 "driver_specific": {} 00:13:41.964 } 00:13:41.964 ] 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.964 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:13:42.221 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.221 "name": "Existed_Raid", 00:13:42.221 "uuid": "15f1e0bb-2165-4279-94f2-43ea0cec1960", 00:13:42.221 "strip_size_kb": 0, 00:13:42.221 "state": "online", 00:13:42.221 "raid_level": "raid1", 00:13:42.221 "superblock": false, 00:13:42.221 "num_base_bdevs": 3, 00:13:42.221 "num_base_bdevs_discovered": 3, 00:13:42.221 "num_base_bdevs_operational": 3, 00:13:42.221 "base_bdevs_list": [ 00:13:42.221 { 00:13:42.222 "name": "NewBaseBdev", 00:13:42.222 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:42.222 "is_configured": true, 00:13:42.222 "data_offset": 0, 00:13:42.222 "data_size": 65536 00:13:42.222 }, 00:13:42.222 { 00:13:42.222 "name": "BaseBdev2", 00:13:42.222 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:42.222 "is_configured": true, 00:13:42.222 "data_offset": 0, 00:13:42.222 "data_size": 65536 00:13:42.222 }, 00:13:42.222 { 00:13:42.222 "name": "BaseBdev3", 00:13:42.222 "uuid": "d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:42.222 "is_configured": true, 00:13:42.222 "data_offset": 0, 00:13:42.222 "data_size": 65536 00:13:42.222 } 00:13:42.222 ] 00:13:42.222 }' 00:13:42.222 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.222 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.790 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:42.790 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:42.790 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:42.790 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:42.790 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:13:42.790 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:42.790 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:42.790 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:42.790 [2024-07-12 11:54:32.922150] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:42.790 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:42.790 "name": "Existed_Raid", 00:13:42.790 "aliases": [ 00:13:42.790 "15f1e0bb-2165-4279-94f2-43ea0cec1960" 00:13:42.790 ], 00:13:42.790 "product_name": "Raid Volume", 00:13:42.790 "block_size": 512, 00:13:42.790 "num_blocks": 65536, 00:13:42.790 "uuid": "15f1e0bb-2165-4279-94f2-43ea0cec1960", 00:13:42.790 "assigned_rate_limits": { 00:13:42.790 "rw_ios_per_sec": 0, 00:13:42.790 "rw_mbytes_per_sec": 0, 00:13:42.790 "r_mbytes_per_sec": 0, 00:13:42.790 "w_mbytes_per_sec": 0 00:13:42.790 }, 00:13:42.790 "claimed": false, 00:13:42.790 "zoned": false, 00:13:42.790 "supported_io_types": { 00:13:42.790 "read": true, 00:13:42.790 "write": true, 00:13:42.790 "unmap": false, 00:13:42.790 "flush": false, 00:13:42.790 "reset": true, 00:13:42.790 "nvme_admin": false, 00:13:42.790 "nvme_io": false, 00:13:42.790 "nvme_io_md": false, 00:13:42.790 "write_zeroes": true, 00:13:42.790 "zcopy": false, 00:13:42.790 "get_zone_info": false, 00:13:42.790 "zone_management": false, 00:13:42.790 "zone_append": false, 00:13:42.790 "compare": false, 00:13:42.790 "compare_and_write": false, 00:13:42.790 "abort": false, 00:13:42.790 "seek_hole": false, 00:13:42.790 "seek_data": false, 00:13:42.790 "copy": false, 00:13:42.790 "nvme_iov_md": false 00:13:42.790 }, 00:13:42.790 "memory_domains": [ 00:13:42.790 { 00:13:42.790 "dma_device_id": "system", 
00:13:42.790 "dma_device_type": 1 00:13:42.790 }, 00:13:42.790 { 00:13:42.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.790 "dma_device_type": 2 00:13:42.790 }, 00:13:42.790 { 00:13:42.790 "dma_device_id": "system", 00:13:42.790 "dma_device_type": 1 00:13:42.790 }, 00:13:42.790 { 00:13:42.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.791 "dma_device_type": 2 00:13:42.791 }, 00:13:42.791 { 00:13:42.791 "dma_device_id": "system", 00:13:42.791 "dma_device_type": 1 00:13:42.791 }, 00:13:42.791 { 00:13:42.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.791 "dma_device_type": 2 00:13:42.791 } 00:13:42.791 ], 00:13:42.791 "driver_specific": { 00:13:42.791 "raid": { 00:13:42.791 "uuid": "15f1e0bb-2165-4279-94f2-43ea0cec1960", 00:13:42.791 "strip_size_kb": 0, 00:13:42.791 "state": "online", 00:13:42.791 "raid_level": "raid1", 00:13:42.791 "superblock": false, 00:13:42.791 "num_base_bdevs": 3, 00:13:42.791 "num_base_bdevs_discovered": 3, 00:13:42.791 "num_base_bdevs_operational": 3, 00:13:42.791 "base_bdevs_list": [ 00:13:42.791 { 00:13:42.791 "name": "NewBaseBdev", 00:13:42.791 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:42.791 "is_configured": true, 00:13:42.791 "data_offset": 0, 00:13:42.791 "data_size": 65536 00:13:42.791 }, 00:13:42.791 { 00:13:42.791 "name": "BaseBdev2", 00:13:42.791 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:42.791 "is_configured": true, 00:13:42.791 "data_offset": 0, 00:13:42.791 "data_size": 65536 00:13:42.791 }, 00:13:42.791 { 00:13:42.791 "name": "BaseBdev3", 00:13:42.791 "uuid": "d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:42.791 "is_configured": true, 00:13:42.791 "data_offset": 0, 00:13:42.791 "data_size": 65536 00:13:42.791 } 00:13:42.791 ] 00:13:42.791 } 00:13:42.791 } 00:13:42.791 }' 00:13:42.791 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:42.791 11:54:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:42.791 BaseBdev2 00:13:42.791 BaseBdev3' 00:13:42.791 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:42.791 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:42.791 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:43.049 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:43.049 "name": "NewBaseBdev", 00:13:43.049 "aliases": [ 00:13:43.049 "2aab7478-55e0-459a-b341-d227a82fcc53" 00:13:43.049 ], 00:13:43.049 "product_name": "Malloc disk", 00:13:43.049 "block_size": 512, 00:13:43.049 "num_blocks": 65536, 00:13:43.049 "uuid": "2aab7478-55e0-459a-b341-d227a82fcc53", 00:13:43.049 "assigned_rate_limits": { 00:13:43.049 "rw_ios_per_sec": 0, 00:13:43.049 "rw_mbytes_per_sec": 0, 00:13:43.049 "r_mbytes_per_sec": 0, 00:13:43.049 "w_mbytes_per_sec": 0 00:13:43.049 }, 00:13:43.049 "claimed": true, 00:13:43.049 "claim_type": "exclusive_write", 00:13:43.049 "zoned": false, 00:13:43.049 "supported_io_types": { 00:13:43.049 "read": true, 00:13:43.049 "write": true, 00:13:43.049 "unmap": true, 00:13:43.049 "flush": true, 00:13:43.049 "reset": true, 00:13:43.049 "nvme_admin": false, 00:13:43.049 "nvme_io": false, 00:13:43.049 "nvme_io_md": false, 00:13:43.049 "write_zeroes": true, 00:13:43.049 "zcopy": true, 00:13:43.049 "get_zone_info": false, 00:13:43.049 "zone_management": false, 00:13:43.049 "zone_append": false, 00:13:43.049 "compare": false, 00:13:43.050 "compare_and_write": false, 00:13:43.050 "abort": true, 00:13:43.050 "seek_hole": false, 00:13:43.050 "seek_data": false, 00:13:43.050 "copy": true, 00:13:43.050 "nvme_iov_md": false 00:13:43.050 }, 00:13:43.050 "memory_domains": [ 
00:13:43.050 { 00:13:43.050 "dma_device_id": "system", 00:13:43.050 "dma_device_type": 1 00:13:43.050 }, 00:13:43.050 { 00:13:43.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.050 "dma_device_type": 2 00:13:43.050 } 00:13:43.050 ], 00:13:43.050 "driver_specific": {} 00:13:43.050 }' 00:13:43.050 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:43.050 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:43.050 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:43.050 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:43.050 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:43.050 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:43.050 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:43.308 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:43.308 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:43.308 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:43.308 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:43.308 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:43.308 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:43.308 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:43.308 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:43.566 11:54:33 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:43.566 "name": "BaseBdev2", 00:13:43.566 "aliases": [ 00:13:43.566 "c7a77523-52bf-4ca2-9f2a-05c26088ad9c" 00:13:43.566 ], 00:13:43.566 "product_name": "Malloc disk", 00:13:43.566 "block_size": 512, 00:13:43.566 "num_blocks": 65536, 00:13:43.566 "uuid": "c7a77523-52bf-4ca2-9f2a-05c26088ad9c", 00:13:43.566 "assigned_rate_limits": { 00:13:43.567 "rw_ios_per_sec": 0, 00:13:43.567 "rw_mbytes_per_sec": 0, 00:13:43.567 "r_mbytes_per_sec": 0, 00:13:43.567 "w_mbytes_per_sec": 0 00:13:43.567 }, 00:13:43.567 "claimed": true, 00:13:43.567 "claim_type": "exclusive_write", 00:13:43.567 "zoned": false, 00:13:43.567 "supported_io_types": { 00:13:43.567 "read": true, 00:13:43.567 "write": true, 00:13:43.567 "unmap": true, 00:13:43.567 "flush": true, 00:13:43.567 "reset": true, 00:13:43.567 "nvme_admin": false, 00:13:43.567 "nvme_io": false, 00:13:43.567 "nvme_io_md": false, 00:13:43.567 "write_zeroes": true, 00:13:43.567 "zcopy": true, 00:13:43.567 "get_zone_info": false, 00:13:43.567 "zone_management": false, 00:13:43.567 "zone_append": false, 00:13:43.567 "compare": false, 00:13:43.567 "compare_and_write": false, 00:13:43.567 "abort": true, 00:13:43.567 "seek_hole": false, 00:13:43.567 "seek_data": false, 00:13:43.567 "copy": true, 00:13:43.567 "nvme_iov_md": false 00:13:43.567 }, 00:13:43.567 "memory_domains": [ 00:13:43.567 { 00:13:43.567 "dma_device_id": "system", 00:13:43.567 "dma_device_type": 1 00:13:43.567 }, 00:13:43.567 { 00:13:43.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.567 "dma_device_type": 2 00:13:43.567 } 00:13:43.567 ], 00:13:43.567 "driver_specific": {} 00:13:43.567 }' 00:13:43.567 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:43.567 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:43.567 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:43.567 11:54:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:43.567 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:43.567 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:43.567 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:43.567 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:43.567 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:43.567 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:43.826 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:43.826 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:43.826 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:43.826 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:43.826 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:43.826 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:43.826 "name": "BaseBdev3", 00:13:43.826 "aliases": [ 00:13:43.826 "d9a8c363-ae86-42f9-938b-993c67e16bf6" 00:13:43.826 ], 00:13:43.826 "product_name": "Malloc disk", 00:13:43.826 "block_size": 512, 00:13:43.826 "num_blocks": 65536, 00:13:43.826 "uuid": "d9a8c363-ae86-42f9-938b-993c67e16bf6", 00:13:43.826 "assigned_rate_limits": { 00:13:43.826 "rw_ios_per_sec": 0, 00:13:43.826 "rw_mbytes_per_sec": 0, 00:13:43.826 "r_mbytes_per_sec": 0, 00:13:43.826 "w_mbytes_per_sec": 0 00:13:43.826 }, 00:13:43.826 "claimed": true, 00:13:43.826 "claim_type": "exclusive_write", 
00:13:43.826 "zoned": false, 00:13:43.826 "supported_io_types": { 00:13:43.826 "read": true, 00:13:43.826 "write": true, 00:13:43.826 "unmap": true, 00:13:43.826 "flush": true, 00:13:43.826 "reset": true, 00:13:43.826 "nvme_admin": false, 00:13:43.826 "nvme_io": false, 00:13:43.826 "nvme_io_md": false, 00:13:43.826 "write_zeroes": true, 00:13:43.826 "zcopy": true, 00:13:43.826 "get_zone_info": false, 00:13:43.826 "zone_management": false, 00:13:43.826 "zone_append": false, 00:13:43.826 "compare": false, 00:13:43.826 "compare_and_write": false, 00:13:43.826 "abort": true, 00:13:43.826 "seek_hole": false, 00:13:43.826 "seek_data": false, 00:13:43.826 "copy": true, 00:13:43.826 "nvme_iov_md": false 00:13:43.826 }, 00:13:43.826 "memory_domains": [ 00:13:43.826 { 00:13:43.826 "dma_device_id": "system", 00:13:43.826 "dma_device_type": 1 00:13:43.826 }, 00:13:43.826 { 00:13:43.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.826 "dma_device_type": 2 00:13:43.826 } 00:13:43.826 ], 00:13:43.826 "driver_specific": {} 00:13:43.826 }' 00:13:43.826 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:44.085 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:44.085 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:44.085 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:44.085 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:44.085 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:44.085 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:44.085 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:44.085 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:44.085 11:54:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:44.085 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:44.345 [2024-07-12 11:54:34.490028] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:44.345 [2024-07-12 11:54:34.490046] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:44.345 [2024-07-12 11:54:34.490080] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:44.345 [2024-07-12 11:54:34.490263] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:44.345 [2024-07-12 11:54:34.490270] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf243d0 name Existed_Raid, state offline 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 635857 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 635857 ']' 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 635857 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 635857 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:44.345 11:54:34 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 635857' 00:13:44.345 killing process with pid 635857 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 635857 00:13:44.345 [2024-07-12 11:54:34.546429] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:44.345 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 635857 00:13:44.345 [2024-07-12 11:54:34.569986] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:44.604 00:13:44.604 real 0m21.384s 00:13:44.604 user 0m39.763s 00:13:44.604 sys 0m3.300s 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.604 ************************************ 00:13:44.604 END TEST raid_state_function_test 00:13:44.604 ************************************ 00:13:44.604 11:54:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:44.604 11:54:34 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:13:44.604 11:54:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:44.604 11:54:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:44.604 11:54:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:44.604 ************************************ 00:13:44.604 START TEST raid_state_function_test_sb 00:13:44.604 ************************************ 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 
3 true 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local 
raid_bdev_name=Existed_Raid 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:44.604 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=640098 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 640098' 00:13:44.605 Process raid pid: 640098 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 640098 /var/tmp/spdk-raid.sock 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 640098 ']' 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:13:44.605 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:44.605 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:44.605 [2024-07-12 11:54:34.844874] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:13:44.605 [2024-07-12 11:54:34.844908] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:44.864 [2024-07-12 11:54:34.908639] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:44.864 [2024-07-12 11:54:34.987000] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:44.864 [2024-07-12 11:54:35.037119] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.864 [2024-07-12 11:54:35.037141] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:45.433 11:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:45.433 11:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:45.433 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:45.693 [2024-07-12 11:54:35.791484] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:45.693 [2024-07-12 11:54:35.791514] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:45.693 [2024-07-12 11:54:35.791526] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
BaseBdev2 00:13:45.693 [2024-07-12 11:54:35.791532] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:45.693 [2024-07-12 11:54:35.791551] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:45.693 [2024-07-12 11:54:35.791556] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.693 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.952 11:54:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.952 "name": "Existed_Raid", 00:13:45.952 "uuid": "0c53557b-8505-4797-bf97-97c8eb5eeff6", 00:13:45.952 "strip_size_kb": 0, 00:13:45.952 "state": "configuring", 00:13:45.952 "raid_level": "raid1", 00:13:45.952 "superblock": true, 00:13:45.952 "num_base_bdevs": 3, 00:13:45.952 "num_base_bdevs_discovered": 0, 00:13:45.952 "num_base_bdevs_operational": 3, 00:13:45.952 "base_bdevs_list": [ 00:13:45.952 { 00:13:45.952 "name": "BaseBdev1", 00:13:45.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.952 "is_configured": false, 00:13:45.952 "data_offset": 0, 00:13:45.952 "data_size": 0 00:13:45.952 }, 00:13:45.952 { 00:13:45.952 "name": "BaseBdev2", 00:13:45.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.952 "is_configured": false, 00:13:45.952 "data_offset": 0, 00:13:45.952 "data_size": 0 00:13:45.952 }, 00:13:45.952 { 00:13:45.952 "name": "BaseBdev3", 00:13:45.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.952 "is_configured": false, 00:13:45.952 "data_offset": 0, 00:13:45.952 "data_size": 0 00:13:45.952 } 00:13:45.952 ] 00:13:45.952 }' 00:13:45.952 11:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.952 11:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:46.520 11:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:46.520 [2024-07-12 11:54:36.613523] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:46.520 [2024-07-12 11:54:36.613542] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228a1d0 name Existed_Raid, state configuring 00:13:46.520 11:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:46.780 [2024-07-12 11:54:36.785987] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:46.780 [2024-07-12 11:54:36.786007] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:46.780 [2024-07-12 11:54:36.786012] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:46.780 [2024-07-12 11:54:36.786018] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:46.780 [2024-07-12 11:54:36.786022] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:46.780 [2024-07-12 11:54:36.786027] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:46.780 11:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:46.780 [2024-07-12 11:54:36.962617] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:46.780 BaseBdev1 00:13:46.780 11:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:46.780 11:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:46.780 11:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:46.780 11:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:46.780 11:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:46.780 11:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:46.780 11:54:36 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:47.039 11:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:47.298 [ 00:13:47.299 { 00:13:47.299 "name": "BaseBdev1", 00:13:47.299 "aliases": [ 00:13:47.299 "7fddcd1e-403a-4a25-9e5d-3a53eff344c6" 00:13:47.299 ], 00:13:47.299 "product_name": "Malloc disk", 00:13:47.299 "block_size": 512, 00:13:47.299 "num_blocks": 65536, 00:13:47.299 "uuid": "7fddcd1e-403a-4a25-9e5d-3a53eff344c6", 00:13:47.299 "assigned_rate_limits": { 00:13:47.299 "rw_ios_per_sec": 0, 00:13:47.299 "rw_mbytes_per_sec": 0, 00:13:47.299 "r_mbytes_per_sec": 0, 00:13:47.299 "w_mbytes_per_sec": 0 00:13:47.299 }, 00:13:47.299 "claimed": true, 00:13:47.299 "claim_type": "exclusive_write", 00:13:47.299 "zoned": false, 00:13:47.299 "supported_io_types": { 00:13:47.299 "read": true, 00:13:47.299 "write": true, 00:13:47.299 "unmap": true, 00:13:47.299 "flush": true, 00:13:47.299 "reset": true, 00:13:47.299 "nvme_admin": false, 00:13:47.299 "nvme_io": false, 00:13:47.299 "nvme_io_md": false, 00:13:47.299 "write_zeroes": true, 00:13:47.299 "zcopy": true, 00:13:47.299 "get_zone_info": false, 00:13:47.299 "zone_management": false, 00:13:47.299 "zone_append": false, 00:13:47.299 "compare": false, 00:13:47.299 "compare_and_write": false, 00:13:47.299 "abort": true, 00:13:47.299 "seek_hole": false, 00:13:47.299 "seek_data": false, 00:13:47.299 "copy": true, 00:13:47.299 "nvme_iov_md": false 00:13:47.299 }, 00:13:47.299 "memory_domains": [ 00:13:47.299 { 00:13:47.299 "dma_device_id": "system", 00:13:47.299 "dma_device_type": 1 00:13:47.299 }, 00:13:47.299 { 00:13:47.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.299 "dma_device_type": 2 00:13:47.299 } 
00:13:47.299 ], 00:13:47.299 "driver_specific": {} 00:13:47.299 } 00:13:47.299 ] 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.299 "name": "Existed_Raid", 00:13:47.299 "uuid": "945d8712-4547-4474-8a61-1d3dfdb12ab5", 00:13:47.299 "strip_size_kb": 0, 00:13:47.299 "state": "configuring", 
00:13:47.299 "raid_level": "raid1", 00:13:47.299 "superblock": true, 00:13:47.299 "num_base_bdevs": 3, 00:13:47.299 "num_base_bdevs_discovered": 1, 00:13:47.299 "num_base_bdevs_operational": 3, 00:13:47.299 "base_bdevs_list": [ 00:13:47.299 { 00:13:47.299 "name": "BaseBdev1", 00:13:47.299 "uuid": "7fddcd1e-403a-4a25-9e5d-3a53eff344c6", 00:13:47.299 "is_configured": true, 00:13:47.299 "data_offset": 2048, 00:13:47.299 "data_size": 63488 00:13:47.299 }, 00:13:47.299 { 00:13:47.299 "name": "BaseBdev2", 00:13:47.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.299 "is_configured": false, 00:13:47.299 "data_offset": 0, 00:13:47.299 "data_size": 0 00:13:47.299 }, 00:13:47.299 { 00:13:47.299 "name": "BaseBdev3", 00:13:47.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.299 "is_configured": false, 00:13:47.299 "data_offset": 0, 00:13:47.299 "data_size": 0 00:13:47.299 } 00:13:47.299 ] 00:13:47.299 }' 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.299 11:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:47.910 11:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:47.910 [2024-07-12 11:54:38.141660] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:47.910 [2024-07-12 11:54:38.141686] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2289aa0 name Existed_Raid, state configuring 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:48.169 [2024-07-12 11:54:38.310117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is 
claimed 00:13:48.169 [2024-07-12 11:54:38.311157] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:48.169 [2024-07-12 11:54:38.311180] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:48.169 [2024-07-12 11:54:38.311185] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:48.169 [2024-07-12 11:54:38.311190] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.169 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:48.428 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.428 "name": "Existed_Raid", 00:13:48.428 "uuid": "0ea7db48-1eb2-4ddc-8c3b-f2c43b178414", 00:13:48.428 "strip_size_kb": 0, 00:13:48.428 "state": "configuring", 00:13:48.428 "raid_level": "raid1", 00:13:48.428 "superblock": true, 00:13:48.428 "num_base_bdevs": 3, 00:13:48.428 "num_base_bdevs_discovered": 1, 00:13:48.428 "num_base_bdevs_operational": 3, 00:13:48.428 "base_bdevs_list": [ 00:13:48.428 { 00:13:48.428 "name": "BaseBdev1", 00:13:48.428 "uuid": "7fddcd1e-403a-4a25-9e5d-3a53eff344c6", 00:13:48.428 "is_configured": true, 00:13:48.428 "data_offset": 2048, 00:13:48.428 "data_size": 63488 00:13:48.428 }, 00:13:48.428 { 00:13:48.428 "name": "BaseBdev2", 00:13:48.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.428 "is_configured": false, 00:13:48.428 "data_offset": 0, 00:13:48.428 "data_size": 0 00:13:48.428 }, 00:13:48.428 { 00:13:48.428 "name": "BaseBdev3", 00:13:48.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.428 "is_configured": false, 00:13:48.428 "data_offset": 0, 00:13:48.428 "data_size": 0 00:13:48.428 } 00:13:48.428 ] 00:13:48.428 }' 00:13:48.428 11:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.428 11:54:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:48.995 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:48.995 [2024-07-12 11:54:39.162941] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:48.995 BaseBdev2 00:13:48.995 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:48.995 11:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:48.995 11:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:48.995 11:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:48.995 11:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:48.995 11:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:48.995 11:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:49.255 11:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:49.255 [ 00:13:49.255 { 00:13:49.255 "name": "BaseBdev2", 00:13:49.255 "aliases": [ 00:13:49.255 "afed175a-1475-44d9-95d6-6954006f8e0f" 00:13:49.255 ], 00:13:49.255 "product_name": "Malloc disk", 00:13:49.255 "block_size": 512, 00:13:49.255 "num_blocks": 65536, 00:13:49.255 "uuid": "afed175a-1475-44d9-95d6-6954006f8e0f", 00:13:49.255 "assigned_rate_limits": { 00:13:49.255 "rw_ios_per_sec": 0, 00:13:49.255 "rw_mbytes_per_sec": 0, 00:13:49.255 "r_mbytes_per_sec": 0, 00:13:49.255 "w_mbytes_per_sec": 0 00:13:49.255 }, 00:13:49.255 "claimed": true, 00:13:49.255 "claim_type": "exclusive_write", 00:13:49.255 "zoned": false, 00:13:49.255 "supported_io_types": { 00:13:49.255 "read": true, 00:13:49.255 "write": true, 00:13:49.255 "unmap": true, 00:13:49.255 "flush": 
true, 00:13:49.255 "reset": true, 00:13:49.255 "nvme_admin": false, 00:13:49.255 "nvme_io": false, 00:13:49.255 "nvme_io_md": false, 00:13:49.255 "write_zeroes": true, 00:13:49.255 "zcopy": true, 00:13:49.255 "get_zone_info": false, 00:13:49.255 "zone_management": false, 00:13:49.255 "zone_append": false, 00:13:49.255 "compare": false, 00:13:49.255 "compare_and_write": false, 00:13:49.255 "abort": true, 00:13:49.255 "seek_hole": false, 00:13:49.255 "seek_data": false, 00:13:49.255 "copy": true, 00:13:49.255 "nvme_iov_md": false 00:13:49.255 }, 00:13:49.255 "memory_domains": [ 00:13:49.255 { 00:13:49.255 "dma_device_id": "system", 00:13:49.255 "dma_device_type": 1 00:13:49.255 }, 00:13:49.255 { 00:13:49.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.255 "dma_device_type": 2 00:13:49.255 } 00:13:49.255 ], 00:13:49.255 "driver_specific": {} 00:13:49.255 } 00:13:49.255 ] 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.514 11:54:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.514 "name": "Existed_Raid", 00:13:49.514 "uuid": "0ea7db48-1eb2-4ddc-8c3b-f2c43b178414", 00:13:49.514 "strip_size_kb": 0, 00:13:49.514 "state": "configuring", 00:13:49.514 "raid_level": "raid1", 00:13:49.514 "superblock": true, 00:13:49.514 "num_base_bdevs": 3, 00:13:49.514 "num_base_bdevs_discovered": 2, 00:13:49.514 "num_base_bdevs_operational": 3, 00:13:49.514 "base_bdevs_list": [ 00:13:49.514 { 00:13:49.514 "name": "BaseBdev1", 00:13:49.514 "uuid": "7fddcd1e-403a-4a25-9e5d-3a53eff344c6", 00:13:49.514 "is_configured": true, 00:13:49.514 "data_offset": 2048, 00:13:49.514 "data_size": 63488 00:13:49.514 }, 00:13:49.514 { 00:13:49.514 "name": "BaseBdev2", 00:13:49.514 "uuid": "afed175a-1475-44d9-95d6-6954006f8e0f", 00:13:49.514 "is_configured": true, 00:13:49.514 "data_offset": 2048, 00:13:49.514 "data_size": 63488 00:13:49.514 }, 00:13:49.514 { 00:13:49.514 "name": "BaseBdev3", 00:13:49.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.514 "is_configured": false, 00:13:49.514 "data_offset": 0, 00:13:49.514 "data_size": 0 00:13:49.514 } 00:13:49.514 ] 00:13:49.514 }' 00:13:49.514 11:54:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.514 11:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:50.082 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:50.341 [2024-07-12 11:54:40.340704] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:50.341 [2024-07-12 11:54:40.340821] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x228a990 00:13:50.341 [2024-07-12 11:54:40.340829] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:50.341 [2024-07-12 11:54:40.340945] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22a1880 00:13:50.341 [2024-07-12 11:54:40.341029] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x228a990 00:13:50.341 [2024-07-12 11:54:40.341034] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x228a990 00:13:50.341 [2024-07-12 11:54:40.341099] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:50.341 BaseBdev3 00:13:50.341 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:50.341 11:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:50.341 11:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:50.341 11:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:50.341 11:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:50.341 11:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:13:50.341 11:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:50.341 11:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:50.601 [ 00:13:50.601 { 00:13:50.601 "name": "BaseBdev3", 00:13:50.601 "aliases": [ 00:13:50.601 "764ee4c1-b98c-45fa-ace5-f1aed3957d82" 00:13:50.601 ], 00:13:50.601 "product_name": "Malloc disk", 00:13:50.601 "block_size": 512, 00:13:50.601 "num_blocks": 65536, 00:13:50.601 "uuid": "764ee4c1-b98c-45fa-ace5-f1aed3957d82", 00:13:50.601 "assigned_rate_limits": { 00:13:50.601 "rw_ios_per_sec": 0, 00:13:50.601 "rw_mbytes_per_sec": 0, 00:13:50.601 "r_mbytes_per_sec": 0, 00:13:50.601 "w_mbytes_per_sec": 0 00:13:50.601 }, 00:13:50.601 "claimed": true, 00:13:50.601 "claim_type": "exclusive_write", 00:13:50.601 "zoned": false, 00:13:50.601 "supported_io_types": { 00:13:50.601 "read": true, 00:13:50.601 "write": true, 00:13:50.601 "unmap": true, 00:13:50.601 "flush": true, 00:13:50.601 "reset": true, 00:13:50.601 "nvme_admin": false, 00:13:50.601 "nvme_io": false, 00:13:50.601 "nvme_io_md": false, 00:13:50.601 "write_zeroes": true, 00:13:50.601 "zcopy": true, 00:13:50.601 "get_zone_info": false, 00:13:50.601 "zone_management": false, 00:13:50.601 "zone_append": false, 00:13:50.601 "compare": false, 00:13:50.601 "compare_and_write": false, 00:13:50.601 "abort": true, 00:13:50.601 "seek_hole": false, 00:13:50.601 "seek_data": false, 00:13:50.601 "copy": true, 00:13:50.601 "nvme_iov_md": false 00:13:50.601 }, 00:13:50.601 "memory_domains": [ 00:13:50.601 { 00:13:50.601 "dma_device_id": "system", 00:13:50.601 "dma_device_type": 1 00:13:50.601 }, 00:13:50.601 { 00:13:50.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.601 "dma_device_type": 2 
00:13:50.601 } 00:13:50.601 ], 00:13:50.601 "driver_specific": {} 00:13:50.601 } 00:13:50.601 ] 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.601 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.860 11:54:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.860 "name": "Existed_Raid", 00:13:50.860 "uuid": "0ea7db48-1eb2-4ddc-8c3b-f2c43b178414", 00:13:50.860 "strip_size_kb": 0, 00:13:50.860 "state": "online", 00:13:50.860 "raid_level": "raid1", 00:13:50.860 "superblock": true, 00:13:50.860 "num_base_bdevs": 3, 00:13:50.860 "num_base_bdevs_discovered": 3, 00:13:50.860 "num_base_bdevs_operational": 3, 00:13:50.860 "base_bdevs_list": [ 00:13:50.860 { 00:13:50.860 "name": "BaseBdev1", 00:13:50.860 "uuid": "7fddcd1e-403a-4a25-9e5d-3a53eff344c6", 00:13:50.860 "is_configured": true, 00:13:50.860 "data_offset": 2048, 00:13:50.860 "data_size": 63488 00:13:50.860 }, 00:13:50.860 { 00:13:50.860 "name": "BaseBdev2", 00:13:50.861 "uuid": "afed175a-1475-44d9-95d6-6954006f8e0f", 00:13:50.861 "is_configured": true, 00:13:50.861 "data_offset": 2048, 00:13:50.861 "data_size": 63488 00:13:50.861 }, 00:13:50.861 { 00:13:50.861 "name": "BaseBdev3", 00:13:50.861 "uuid": "764ee4c1-b98c-45fa-ace5-f1aed3957d82", 00:13:50.861 "is_configured": true, 00:13:50.861 "data_offset": 2048, 00:13:50.861 "data_size": 63488 00:13:50.861 } 00:13:50.861 ] 00:13:50.861 }' 00:13:50.861 11:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.861 11:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:51.119 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:51.119 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:51.119 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:51.119 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:51.119 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:51.119 11:54:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:51.119 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:51.119 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:51.379 [2024-07-12 11:54:41.507894] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:51.379 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:51.379 "name": "Existed_Raid", 00:13:51.379 "aliases": [ 00:13:51.379 "0ea7db48-1eb2-4ddc-8c3b-f2c43b178414" 00:13:51.379 ], 00:13:51.379 "product_name": "Raid Volume", 00:13:51.379 "block_size": 512, 00:13:51.379 "num_blocks": 63488, 00:13:51.379 "uuid": "0ea7db48-1eb2-4ddc-8c3b-f2c43b178414", 00:13:51.379 "assigned_rate_limits": { 00:13:51.379 "rw_ios_per_sec": 0, 00:13:51.379 "rw_mbytes_per_sec": 0, 00:13:51.379 "r_mbytes_per_sec": 0, 00:13:51.379 "w_mbytes_per_sec": 0 00:13:51.379 }, 00:13:51.379 "claimed": false, 00:13:51.379 "zoned": false, 00:13:51.379 "supported_io_types": { 00:13:51.379 "read": true, 00:13:51.379 "write": true, 00:13:51.379 "unmap": false, 00:13:51.379 "flush": false, 00:13:51.379 "reset": true, 00:13:51.379 "nvme_admin": false, 00:13:51.379 "nvme_io": false, 00:13:51.379 "nvme_io_md": false, 00:13:51.379 "write_zeroes": true, 00:13:51.379 "zcopy": false, 00:13:51.379 "get_zone_info": false, 00:13:51.379 "zone_management": false, 00:13:51.379 "zone_append": false, 00:13:51.379 "compare": false, 00:13:51.379 "compare_and_write": false, 00:13:51.379 "abort": false, 00:13:51.379 "seek_hole": false, 00:13:51.379 "seek_data": false, 00:13:51.379 "copy": false, 00:13:51.379 "nvme_iov_md": false 00:13:51.379 }, 00:13:51.379 "memory_domains": [ 00:13:51.379 { 00:13:51.379 "dma_device_id": "system", 00:13:51.379 "dma_device_type": 1 
00:13:51.379 }, 00:13:51.379 { 00:13:51.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.379 "dma_device_type": 2 00:13:51.379 }, 00:13:51.379 { 00:13:51.379 "dma_device_id": "system", 00:13:51.379 "dma_device_type": 1 00:13:51.379 }, 00:13:51.379 { 00:13:51.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.379 "dma_device_type": 2 00:13:51.379 }, 00:13:51.379 { 00:13:51.379 "dma_device_id": "system", 00:13:51.379 "dma_device_type": 1 00:13:51.379 }, 00:13:51.379 { 00:13:51.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.379 "dma_device_type": 2 00:13:51.379 } 00:13:51.379 ], 00:13:51.379 "driver_specific": { 00:13:51.379 "raid": { 00:13:51.379 "uuid": "0ea7db48-1eb2-4ddc-8c3b-f2c43b178414", 00:13:51.379 "strip_size_kb": 0, 00:13:51.379 "state": "online", 00:13:51.379 "raid_level": "raid1", 00:13:51.379 "superblock": true, 00:13:51.379 "num_base_bdevs": 3, 00:13:51.379 "num_base_bdevs_discovered": 3, 00:13:51.379 "num_base_bdevs_operational": 3, 00:13:51.379 "base_bdevs_list": [ 00:13:51.379 { 00:13:51.379 "name": "BaseBdev1", 00:13:51.379 "uuid": "7fddcd1e-403a-4a25-9e5d-3a53eff344c6", 00:13:51.379 "is_configured": true, 00:13:51.379 "data_offset": 2048, 00:13:51.379 "data_size": 63488 00:13:51.379 }, 00:13:51.379 { 00:13:51.379 "name": "BaseBdev2", 00:13:51.379 "uuid": "afed175a-1475-44d9-95d6-6954006f8e0f", 00:13:51.379 "is_configured": true, 00:13:51.379 "data_offset": 2048, 00:13:51.379 "data_size": 63488 00:13:51.379 }, 00:13:51.379 { 00:13:51.379 "name": "BaseBdev3", 00:13:51.379 "uuid": "764ee4c1-b98c-45fa-ace5-f1aed3957d82", 00:13:51.379 "is_configured": true, 00:13:51.379 "data_offset": 2048, 00:13:51.379 "data_size": 63488 00:13:51.379 } 00:13:51.379 ] 00:13:51.379 } 00:13:51.379 } 00:13:51.379 }' 00:13:51.379 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:51.379 11:54:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:51.379 BaseBdev2 00:13:51.379 BaseBdev3' 00:13:51.379 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:51.379 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:51.379 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:51.638 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:51.638 "name": "BaseBdev1", 00:13:51.638 "aliases": [ 00:13:51.638 "7fddcd1e-403a-4a25-9e5d-3a53eff344c6" 00:13:51.638 ], 00:13:51.638 "product_name": "Malloc disk", 00:13:51.638 "block_size": 512, 00:13:51.638 "num_blocks": 65536, 00:13:51.638 "uuid": "7fddcd1e-403a-4a25-9e5d-3a53eff344c6", 00:13:51.638 "assigned_rate_limits": { 00:13:51.638 "rw_ios_per_sec": 0, 00:13:51.638 "rw_mbytes_per_sec": 0, 00:13:51.638 "r_mbytes_per_sec": 0, 00:13:51.638 "w_mbytes_per_sec": 0 00:13:51.638 }, 00:13:51.638 "claimed": true, 00:13:51.638 "claim_type": "exclusive_write", 00:13:51.638 "zoned": false, 00:13:51.638 "supported_io_types": { 00:13:51.638 "read": true, 00:13:51.638 "write": true, 00:13:51.638 "unmap": true, 00:13:51.638 "flush": true, 00:13:51.638 "reset": true, 00:13:51.638 "nvme_admin": false, 00:13:51.638 "nvme_io": false, 00:13:51.638 "nvme_io_md": false, 00:13:51.638 "write_zeroes": true, 00:13:51.638 "zcopy": true, 00:13:51.638 "get_zone_info": false, 00:13:51.638 "zone_management": false, 00:13:51.638 "zone_append": false, 00:13:51.638 "compare": false, 00:13:51.638 "compare_and_write": false, 00:13:51.638 "abort": true, 00:13:51.638 "seek_hole": false, 00:13:51.638 "seek_data": false, 00:13:51.638 "copy": true, 00:13:51.638 "nvme_iov_md": false 00:13:51.638 }, 00:13:51.638 "memory_domains": [ 00:13:51.638 { 00:13:51.638 
"dma_device_id": "system", 00:13:51.638 "dma_device_type": 1 00:13:51.638 }, 00:13:51.638 { 00:13:51.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.638 "dma_device_type": 2 00:13:51.638 } 00:13:51.638 ], 00:13:51.638 "driver_specific": {} 00:13:51.638 }' 00:13:51.638 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.638 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.638 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:51.638 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:51.638 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:51.638 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:51.638 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:51.897 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:51.897 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:51.898 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:51.898 11:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:51.898 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:51.898 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:51.898 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:51.898 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:52.157 11:54:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.157 "name": "BaseBdev2", 00:13:52.157 "aliases": [ 00:13:52.157 "afed175a-1475-44d9-95d6-6954006f8e0f" 00:13:52.157 ], 00:13:52.157 "product_name": "Malloc disk", 00:13:52.157 "block_size": 512, 00:13:52.157 "num_blocks": 65536, 00:13:52.157 "uuid": "afed175a-1475-44d9-95d6-6954006f8e0f", 00:13:52.157 "assigned_rate_limits": { 00:13:52.157 "rw_ios_per_sec": 0, 00:13:52.157 "rw_mbytes_per_sec": 0, 00:13:52.157 "r_mbytes_per_sec": 0, 00:13:52.157 "w_mbytes_per_sec": 0 00:13:52.157 }, 00:13:52.157 "claimed": true, 00:13:52.157 "claim_type": "exclusive_write", 00:13:52.157 "zoned": false, 00:13:52.157 "supported_io_types": { 00:13:52.157 "read": true, 00:13:52.157 "write": true, 00:13:52.157 "unmap": true, 00:13:52.157 "flush": true, 00:13:52.157 "reset": true, 00:13:52.157 "nvme_admin": false, 00:13:52.157 "nvme_io": false, 00:13:52.157 "nvme_io_md": false, 00:13:52.157 "write_zeroes": true, 00:13:52.157 "zcopy": true, 00:13:52.157 "get_zone_info": false, 00:13:52.157 "zone_management": false, 00:13:52.157 "zone_append": false, 00:13:52.157 "compare": false, 00:13:52.157 "compare_and_write": false, 00:13:52.157 "abort": true, 00:13:52.157 "seek_hole": false, 00:13:52.157 "seek_data": false, 00:13:52.157 "copy": true, 00:13:52.157 "nvme_iov_md": false 00:13:52.157 }, 00:13:52.157 "memory_domains": [ 00:13:52.157 { 00:13:52.157 "dma_device_id": "system", 00:13:52.157 "dma_device_type": 1 00:13:52.157 }, 00:13:52.157 { 00:13:52.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.157 "dma_device_type": 2 00:13:52.157 } 00:13:52.157 ], 00:13:52.157 "driver_specific": {} 00:13:52.157 }' 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:52.157 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.416 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.416 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:52.416 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.416 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:52.416 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:52.416 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.416 "name": "BaseBdev3", 00:13:52.416 "aliases": [ 00:13:52.416 "764ee4c1-b98c-45fa-ace5-f1aed3957d82" 00:13:52.416 ], 00:13:52.416 "product_name": "Malloc disk", 00:13:52.416 "block_size": 512, 00:13:52.416 "num_blocks": 65536, 00:13:52.416 "uuid": "764ee4c1-b98c-45fa-ace5-f1aed3957d82", 00:13:52.416 "assigned_rate_limits": { 00:13:52.416 "rw_ios_per_sec": 0, 00:13:52.416 "rw_mbytes_per_sec": 0, 00:13:52.416 "r_mbytes_per_sec": 0, 00:13:52.416 "w_mbytes_per_sec": 0 
00:13:52.416 }, 00:13:52.416 "claimed": true, 00:13:52.416 "claim_type": "exclusive_write", 00:13:52.416 "zoned": false, 00:13:52.416 "supported_io_types": { 00:13:52.416 "read": true, 00:13:52.416 "write": true, 00:13:52.416 "unmap": true, 00:13:52.416 "flush": true, 00:13:52.416 "reset": true, 00:13:52.416 "nvme_admin": false, 00:13:52.416 "nvme_io": false, 00:13:52.416 "nvme_io_md": false, 00:13:52.416 "write_zeroes": true, 00:13:52.416 "zcopy": true, 00:13:52.416 "get_zone_info": false, 00:13:52.416 "zone_management": false, 00:13:52.416 "zone_append": false, 00:13:52.416 "compare": false, 00:13:52.416 "compare_and_write": false, 00:13:52.416 "abort": true, 00:13:52.416 "seek_hole": false, 00:13:52.416 "seek_data": false, 00:13:52.416 "copy": true, 00:13:52.416 "nvme_iov_md": false 00:13:52.416 }, 00:13:52.416 "memory_domains": [ 00:13:52.416 { 00:13:52.416 "dma_device_id": "system", 00:13:52.416 "dma_device_type": 1 00:13:52.416 }, 00:13:52.416 { 00:13:52.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.416 "dma_device_type": 2 00:13:52.416 } 00:13:52.416 ], 00:13:52.416 "driver_specific": {} 00:13:52.416 }' 00:13:52.416 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.675 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.675 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:52.675 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.675 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.675 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.675 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.675 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.675 
11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:52.675 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.675 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.944 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:52.944 11:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:52.945 [2024-07-12 11:54:43.083826] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:52.945 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.207 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.207 "name": "Existed_Raid", 00:13:53.207 "uuid": "0ea7db48-1eb2-4ddc-8c3b-f2c43b178414", 00:13:53.207 "strip_size_kb": 0, 00:13:53.207 "state": "online", 00:13:53.207 "raid_level": "raid1", 00:13:53.207 "superblock": true, 00:13:53.207 "num_base_bdevs": 3, 00:13:53.207 "num_base_bdevs_discovered": 2, 00:13:53.207 "num_base_bdevs_operational": 2, 00:13:53.207 "base_bdevs_list": [ 00:13:53.207 { 00:13:53.207 "name": null, 00:13:53.207 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.207 "is_configured": false, 00:13:53.207 "data_offset": 2048, 00:13:53.207 "data_size": 63488 00:13:53.207 }, 00:13:53.207 { 00:13:53.207 "name": "BaseBdev2", 00:13:53.207 "uuid": "afed175a-1475-44d9-95d6-6954006f8e0f", 00:13:53.207 "is_configured": true, 00:13:53.207 "data_offset": 2048, 00:13:53.207 "data_size": 63488 00:13:53.207 }, 00:13:53.207 { 00:13:53.207 "name": "BaseBdev3", 00:13:53.207 "uuid": "764ee4c1-b98c-45fa-ace5-f1aed3957d82", 00:13:53.207 "is_configured": true, 00:13:53.207 "data_offset": 2048, 00:13:53.207 "data_size": 63488 00:13:53.207 } 00:13:53.207 ] 00:13:53.207 }' 
00:13:53.207 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.207 11:54:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:53.773 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:53.773 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:53.773 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.773 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:53.773 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:53.773 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:53.773 11:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:54.031 [2024-07-12 11:54:44.099258] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:54.031 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:54.031 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.031 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.031 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:54.289 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:54.289 11:54:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:54.289 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:54.289 [2024-07-12 11:54:44.433990] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:54.289 [2024-07-12 11:54:44.434046] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:54.289 [2024-07-12 11:54:44.444032] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:54.289 [2024-07-12 11:54:44.444074] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:54.289 [2024-07-12 11:54:44.444080] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228a990 name Existed_Raid, state offline 00:13:54.289 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:54.289 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.289 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.289 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:54.547 BaseBdev2 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:54.547 11:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:54.806 11:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:55.065 [ 00:13:55.065 { 00:13:55.065 "name": "BaseBdev2", 00:13:55.065 "aliases": [ 00:13:55.065 "52076b56-80aa-4064-8973-94241ed6b0d6" 00:13:55.065 ], 00:13:55.065 "product_name": "Malloc disk", 00:13:55.065 "block_size": 512, 00:13:55.065 "num_blocks": 65536, 00:13:55.065 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:13:55.065 "assigned_rate_limits": { 00:13:55.065 "rw_ios_per_sec": 0, 00:13:55.065 "rw_mbytes_per_sec": 0, 00:13:55.065 "r_mbytes_per_sec": 0, 00:13:55.065 "w_mbytes_per_sec": 0 00:13:55.065 }, 00:13:55.065 "claimed": false, 00:13:55.065 "zoned": false, 
00:13:55.065 "supported_io_types": { 00:13:55.065 "read": true, 00:13:55.065 "write": true, 00:13:55.065 "unmap": true, 00:13:55.065 "flush": true, 00:13:55.065 "reset": true, 00:13:55.065 "nvme_admin": false, 00:13:55.065 "nvme_io": false, 00:13:55.065 "nvme_io_md": false, 00:13:55.065 "write_zeroes": true, 00:13:55.065 "zcopy": true, 00:13:55.065 "get_zone_info": false, 00:13:55.065 "zone_management": false, 00:13:55.065 "zone_append": false, 00:13:55.065 "compare": false, 00:13:55.065 "compare_and_write": false, 00:13:55.065 "abort": true, 00:13:55.065 "seek_hole": false, 00:13:55.065 "seek_data": false, 00:13:55.065 "copy": true, 00:13:55.065 "nvme_iov_md": false 00:13:55.065 }, 00:13:55.065 "memory_domains": [ 00:13:55.065 { 00:13:55.065 "dma_device_id": "system", 00:13:55.065 "dma_device_type": 1 00:13:55.065 }, 00:13:55.065 { 00:13:55.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:55.065 "dma_device_type": 2 00:13:55.065 } 00:13:55.065 ], 00:13:55.065 "driver_specific": {} 00:13:55.065 } 00:13:55.065 ] 00:13:55.065 11:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:55.065 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:55.066 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:55.066 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:55.066 BaseBdev3 00:13:55.066 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:55.066 11:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:55.066 11:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:55.066 11:54:45 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:55.066 11:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:55.066 11:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:55.066 11:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:55.324 11:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:55.582 [ 00:13:55.582 { 00:13:55.582 "name": "BaseBdev3", 00:13:55.582 "aliases": [ 00:13:55.582 "ce05c121-c9a3-4a43-816a-3e49144767a2" 00:13:55.582 ], 00:13:55.582 "product_name": "Malloc disk", 00:13:55.583 "block_size": 512, 00:13:55.583 "num_blocks": 65536, 00:13:55.583 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:13:55.583 "assigned_rate_limits": { 00:13:55.583 "rw_ios_per_sec": 0, 00:13:55.583 "rw_mbytes_per_sec": 0, 00:13:55.583 "r_mbytes_per_sec": 0, 00:13:55.583 "w_mbytes_per_sec": 0 00:13:55.583 }, 00:13:55.583 "claimed": false, 00:13:55.583 "zoned": false, 00:13:55.583 "supported_io_types": { 00:13:55.583 "read": true, 00:13:55.583 "write": true, 00:13:55.583 "unmap": true, 00:13:55.583 "flush": true, 00:13:55.583 "reset": true, 00:13:55.583 "nvme_admin": false, 00:13:55.583 "nvme_io": false, 00:13:55.583 "nvme_io_md": false, 00:13:55.583 "write_zeroes": true, 00:13:55.583 "zcopy": true, 00:13:55.583 "get_zone_info": false, 00:13:55.583 "zone_management": false, 00:13:55.583 "zone_append": false, 00:13:55.583 "compare": false, 00:13:55.583 "compare_and_write": false, 00:13:55.583 "abort": true, 00:13:55.583 "seek_hole": false, 00:13:55.583 "seek_data": false, 00:13:55.583 "copy": true, 00:13:55.583 "nvme_iov_md": 
false 00:13:55.583 }, 00:13:55.583 "memory_domains": [ 00:13:55.583 { 00:13:55.583 "dma_device_id": "system", 00:13:55.583 "dma_device_type": 1 00:13:55.583 }, 00:13:55.583 { 00:13:55.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:55.583 "dma_device_type": 2 00:13:55.583 } 00:13:55.583 ], 00:13:55.583 "driver_specific": {} 00:13:55.583 } 00:13:55.583 ] 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:55.583 [2024-07-12 11:54:45.754949] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:55.583 [2024-07-12 11:54:45.754976] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:55.583 [2024-07-12 11:54:45.754987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:55.583 [2024-07-12 11:54:45.755959] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:55.583 11:54:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.583 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.842 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.842 "name": "Existed_Raid", 00:13:55.842 "uuid": "06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:13:55.842 "strip_size_kb": 0, 00:13:55.842 "state": "configuring", 00:13:55.842 "raid_level": "raid1", 00:13:55.842 "superblock": true, 00:13:55.842 "num_base_bdevs": 3, 00:13:55.842 "num_base_bdevs_discovered": 2, 00:13:55.842 "num_base_bdevs_operational": 3, 00:13:55.842 "base_bdevs_list": [ 00:13:55.842 { 00:13:55.842 "name": "BaseBdev1", 00:13:55.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.842 "is_configured": false, 00:13:55.842 "data_offset": 0, 00:13:55.842 "data_size": 0 00:13:55.842 }, 00:13:55.842 { 00:13:55.842 "name": "BaseBdev2", 00:13:55.842 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:13:55.842 "is_configured": true, 00:13:55.842 "data_offset": 2048, 00:13:55.842 "data_size": 63488 00:13:55.842 }, 00:13:55.842 { 00:13:55.842 "name": "BaseBdev3", 
00:13:55.842 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:13:55.842 "is_configured": true, 00:13:55.842 "data_offset": 2048, 00:13:55.842 "data_size": 63488 00:13:55.842 } 00:13:55.842 ] 00:13:55.842 }' 00:13:55.842 11:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.842 11:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:56.408 [2024-07-12 11:54:46.577057] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.408 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.667 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.667 "name": "Existed_Raid", 00:13:56.667 "uuid": "06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:13:56.667 "strip_size_kb": 0, 00:13:56.667 "state": "configuring", 00:13:56.667 "raid_level": "raid1", 00:13:56.667 "superblock": true, 00:13:56.667 "num_base_bdevs": 3, 00:13:56.667 "num_base_bdevs_discovered": 1, 00:13:56.667 "num_base_bdevs_operational": 3, 00:13:56.667 "base_bdevs_list": [ 00:13:56.667 { 00:13:56.667 "name": "BaseBdev1", 00:13:56.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.667 "is_configured": false, 00:13:56.667 "data_offset": 0, 00:13:56.667 "data_size": 0 00:13:56.667 }, 00:13:56.667 { 00:13:56.667 "name": null, 00:13:56.667 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:13:56.667 "is_configured": false, 00:13:56.667 "data_offset": 2048, 00:13:56.667 "data_size": 63488 00:13:56.667 }, 00:13:56.667 { 00:13:56.667 "name": "BaseBdev3", 00:13:56.667 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:13:56.667 "is_configured": true, 00:13:56.667 "data_offset": 2048, 00:13:56.667 "data_size": 63488 00:13:56.667 } 00:13:56.667 ] 00:13:56.667 }' 00:13:56.667 11:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.667 11:54:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.234 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.234 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:13:57.234 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:57.234 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:57.491 [2024-07-12 11:54:47.566101] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:57.491 BaseBdev1 00:13:57.491 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:57.491 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:57.491 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:57.491 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:57.491 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:57.491 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:57.491 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:57.748 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:57.749 [ 00:13:57.749 { 00:13:57.749 "name": "BaseBdev1", 00:13:57.749 "aliases": [ 00:13:57.749 "58175efa-3e45-4b79-991f-e5313ffc03cd" 00:13:57.749 ], 00:13:57.749 "product_name": "Malloc disk", 00:13:57.749 "block_size": 512, 00:13:57.749 "num_blocks": 65536, 00:13:57.749 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:13:57.749 
"assigned_rate_limits": { 00:13:57.749 "rw_ios_per_sec": 0, 00:13:57.749 "rw_mbytes_per_sec": 0, 00:13:57.749 "r_mbytes_per_sec": 0, 00:13:57.749 "w_mbytes_per_sec": 0 00:13:57.749 }, 00:13:57.749 "claimed": true, 00:13:57.749 "claim_type": "exclusive_write", 00:13:57.749 "zoned": false, 00:13:57.749 "supported_io_types": { 00:13:57.749 "read": true, 00:13:57.749 "write": true, 00:13:57.749 "unmap": true, 00:13:57.749 "flush": true, 00:13:57.749 "reset": true, 00:13:57.749 "nvme_admin": false, 00:13:57.749 "nvme_io": false, 00:13:57.749 "nvme_io_md": false, 00:13:57.749 "write_zeroes": true, 00:13:57.749 "zcopy": true, 00:13:57.749 "get_zone_info": false, 00:13:57.749 "zone_management": false, 00:13:57.749 "zone_append": false, 00:13:57.749 "compare": false, 00:13:57.749 "compare_and_write": false, 00:13:57.749 "abort": true, 00:13:57.749 "seek_hole": false, 00:13:57.749 "seek_data": false, 00:13:57.749 "copy": true, 00:13:57.749 "nvme_iov_md": false 00:13:57.749 }, 00:13:57.749 "memory_domains": [ 00:13:57.749 { 00:13:57.749 "dma_device_id": "system", 00:13:57.749 "dma_device_type": 1 00:13:57.749 }, 00:13:57.749 { 00:13:57.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.749 "dma_device_type": 2 00:13:57.749 } 00:13:57.749 ], 00:13:57.749 "driver_specific": {} 00:13:57.749 } 00:13:57.749 ] 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.749 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.006 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.006 "name": "Existed_Raid", 00:13:58.006 "uuid": "06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:13:58.006 "strip_size_kb": 0, 00:13:58.006 "state": "configuring", 00:13:58.006 "raid_level": "raid1", 00:13:58.006 "superblock": true, 00:13:58.006 "num_base_bdevs": 3, 00:13:58.006 "num_base_bdevs_discovered": 2, 00:13:58.006 "num_base_bdevs_operational": 3, 00:13:58.006 "base_bdevs_list": [ 00:13:58.006 { 00:13:58.006 "name": "BaseBdev1", 00:13:58.006 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:13:58.006 "is_configured": true, 00:13:58.006 "data_offset": 2048, 00:13:58.006 "data_size": 63488 00:13:58.006 }, 00:13:58.006 { 00:13:58.006 "name": null, 00:13:58.006 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:13:58.006 "is_configured": false, 00:13:58.006 "data_offset": 2048, 00:13:58.006 "data_size": 63488 00:13:58.006 }, 00:13:58.006 { 00:13:58.006 "name": "BaseBdev3", 00:13:58.006 "uuid": 
"ce05c121-c9a3-4a43-816a-3e49144767a2", 00:13:58.006 "is_configured": true, 00:13:58.006 "data_offset": 2048, 00:13:58.006 "data_size": 63488 00:13:58.006 } 00:13:58.006 ] 00:13:58.006 }' 00:13:58.006 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.006 11:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:58.573 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.573 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:58.573 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:58.573 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:58.832 [2024-07-12 11:54:48.881548] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.832 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.832 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.832 "name": "Existed_Raid", 00:13:58.832 "uuid": "06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:13:58.832 "strip_size_kb": 0, 00:13:58.832 "state": "configuring", 00:13:58.832 "raid_level": "raid1", 00:13:58.832 "superblock": true, 00:13:58.832 "num_base_bdevs": 3, 00:13:58.832 "num_base_bdevs_discovered": 1, 00:13:58.832 "num_base_bdevs_operational": 3, 00:13:58.832 "base_bdevs_list": [ 00:13:58.832 { 00:13:58.832 "name": "BaseBdev1", 00:13:58.832 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:13:58.832 "is_configured": true, 00:13:58.832 "data_offset": 2048, 00:13:58.832 "data_size": 63488 00:13:58.832 }, 00:13:58.832 { 00:13:58.832 "name": null, 00:13:58.832 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:13:58.832 "is_configured": false, 00:13:58.832 "data_offset": 2048, 00:13:58.832 "data_size": 63488 00:13:58.832 }, 00:13:58.832 { 00:13:58.832 "name": null, 00:13:58.832 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:13:58.832 "is_configured": false, 00:13:58.832 "data_offset": 2048, 00:13:58.832 "data_size": 63488 00:13:58.832 } 00:13:58.832 ] 00:13:58.832 }' 00:13:58.832 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:13:58.832 11:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.398 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.398 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:59.655 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:59.656 [2024-07-12 11:54:49.860099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.656 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.913 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.913 "name": "Existed_Raid", 00:13:59.913 "uuid": "06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:13:59.913 "strip_size_kb": 0, 00:13:59.913 "state": "configuring", 00:13:59.913 "raid_level": "raid1", 00:13:59.913 "superblock": true, 00:13:59.913 "num_base_bdevs": 3, 00:13:59.913 "num_base_bdevs_discovered": 2, 00:13:59.913 "num_base_bdevs_operational": 3, 00:13:59.914 "base_bdevs_list": [ 00:13:59.914 { 00:13:59.914 "name": "BaseBdev1", 00:13:59.914 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:13:59.914 "is_configured": true, 00:13:59.914 "data_offset": 2048, 00:13:59.914 "data_size": 63488 00:13:59.914 }, 00:13:59.914 { 00:13:59.914 "name": null, 00:13:59.914 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:13:59.914 "is_configured": false, 00:13:59.914 "data_offset": 2048, 00:13:59.914 "data_size": 63488 00:13:59.914 }, 00:13:59.914 { 00:13:59.914 "name": "BaseBdev3", 00:13:59.914 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:13:59.914 "is_configured": true, 00:13:59.914 "data_offset": 2048, 00:13:59.914 "data_size": 63488 00:13:59.914 } 00:13:59.914 ] 00:13:59.914 }' 00:13:59.914 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.914 11:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:00.481 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:14:00.481 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.481 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:00.481 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:00.740 [2024-07-12 11:54:50.806567] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.740 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.998 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.998 "name": "Existed_Raid", 00:14:00.998 "uuid": "06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:14:00.998 "strip_size_kb": 0, 00:14:00.998 "state": "configuring", 00:14:00.998 "raid_level": "raid1", 00:14:00.998 "superblock": true, 00:14:00.998 "num_base_bdevs": 3, 00:14:00.998 "num_base_bdevs_discovered": 1, 00:14:00.998 "num_base_bdevs_operational": 3, 00:14:00.998 "base_bdevs_list": [ 00:14:00.998 { 00:14:00.998 "name": null, 00:14:00.998 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:14:00.998 "is_configured": false, 00:14:00.998 "data_offset": 2048, 00:14:00.998 "data_size": 63488 00:14:00.998 }, 00:14:00.998 { 00:14:00.998 "name": null, 00:14:00.998 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:14:00.998 "is_configured": false, 00:14:00.998 "data_offset": 2048, 00:14:00.998 "data_size": 63488 00:14:00.998 }, 00:14:00.998 { 00:14:00.998 "name": "BaseBdev3", 00:14:00.998 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:14:00.998 "is_configured": true, 00:14:00.998 "data_offset": 2048, 00:14:00.998 "data_size": 63488 00:14:00.998 } 00:14:00.998 ] 00:14:00.998 }' 00:14:00.998 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.998 11:54:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:01.256 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.256 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:14:01.515 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:01.515 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:01.774 [2024-07-12 11:54:51.794415] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.774 "name": "Existed_Raid", 00:14:01.774 "uuid": "06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:14:01.774 "strip_size_kb": 0, 00:14:01.774 "state": "configuring", 00:14:01.774 "raid_level": "raid1", 00:14:01.774 "superblock": true, 00:14:01.774 "num_base_bdevs": 3, 00:14:01.774 "num_base_bdevs_discovered": 2, 00:14:01.774 "num_base_bdevs_operational": 3, 00:14:01.774 "base_bdevs_list": [ 00:14:01.774 { 00:14:01.774 "name": null, 00:14:01.774 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:14:01.774 "is_configured": false, 00:14:01.774 "data_offset": 2048, 00:14:01.774 "data_size": 63488 00:14:01.774 }, 00:14:01.774 { 00:14:01.774 "name": "BaseBdev2", 00:14:01.774 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:14:01.774 "is_configured": true, 00:14:01.774 "data_offset": 2048, 00:14:01.774 "data_size": 63488 00:14:01.774 }, 00:14:01.774 { 00:14:01.774 "name": "BaseBdev3", 00:14:01.774 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:14:01.774 "is_configured": true, 00:14:01.774 "data_offset": 2048, 00:14:01.774 "data_size": 63488 00:14:01.774 } 00:14:01.774 ] 00:14:01.774 }' 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.774 11:54:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:02.340 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.340 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:02.598 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:02.598 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.598 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:02.598 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 58175efa-3e45-4b79-991f-e5313ffc03cd 00:14:02.856 [2024-07-12 11:54:52.971983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:02.856 [2024-07-12 11:54:52.972088] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x228b830 00:14:02.856 [2024-07-12 11:54:52.972096] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:02.856 [2024-07-12 11:54:52.972215] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x243cd70 00:14:02.856 [2024-07-12 11:54:52.972296] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x228b830 00:14:02.856 [2024-07-12 11:54:52.972300] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x228b830 00:14:02.856 [2024-07-12 11:54:52.972360] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:02.856 NewBaseBdev 00:14:02.856 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:02.856 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:02.856 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:02.856 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:02.856 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:02.856 
11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:02.856 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:03.115 [ 00:14:03.115 { 00:14:03.115 "name": "NewBaseBdev", 00:14:03.115 "aliases": [ 00:14:03.115 "58175efa-3e45-4b79-991f-e5313ffc03cd" 00:14:03.115 ], 00:14:03.115 "product_name": "Malloc disk", 00:14:03.115 "block_size": 512, 00:14:03.115 "num_blocks": 65536, 00:14:03.115 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:14:03.115 "assigned_rate_limits": { 00:14:03.115 "rw_ios_per_sec": 0, 00:14:03.115 "rw_mbytes_per_sec": 0, 00:14:03.115 "r_mbytes_per_sec": 0, 00:14:03.115 "w_mbytes_per_sec": 0 00:14:03.115 }, 00:14:03.115 "claimed": true, 00:14:03.115 "claim_type": "exclusive_write", 00:14:03.115 "zoned": false, 00:14:03.115 "supported_io_types": { 00:14:03.115 "read": true, 00:14:03.115 "write": true, 00:14:03.115 "unmap": true, 00:14:03.115 "flush": true, 00:14:03.115 "reset": true, 00:14:03.115 "nvme_admin": false, 00:14:03.115 "nvme_io": false, 00:14:03.115 "nvme_io_md": false, 00:14:03.115 "write_zeroes": true, 00:14:03.115 "zcopy": true, 00:14:03.115 "get_zone_info": false, 00:14:03.115 "zone_management": false, 00:14:03.115 "zone_append": false, 00:14:03.115 "compare": false, 00:14:03.115 "compare_and_write": false, 00:14:03.115 "abort": true, 00:14:03.115 "seek_hole": false, 00:14:03.115 "seek_data": false, 00:14:03.115 "copy": true, 00:14:03.115 "nvme_iov_md": false 00:14:03.115 }, 00:14:03.115 "memory_domains": [ 00:14:03.115 { 00:14:03.115 "dma_device_id": "system", 00:14:03.115 "dma_device_type": 1 00:14:03.115 
}, 00:14:03.115 { 00:14:03.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.115 "dma_device_type": 2 00:14:03.115 } 00:14:03.115 ], 00:14:03.115 "driver_specific": {} 00:14:03.115 } 00:14:03.115 ] 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.115 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.373 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.373 "name": "Existed_Raid", 00:14:03.373 "uuid": 
"06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:14:03.373 "strip_size_kb": 0, 00:14:03.373 "state": "online", 00:14:03.373 "raid_level": "raid1", 00:14:03.373 "superblock": true, 00:14:03.373 "num_base_bdevs": 3, 00:14:03.373 "num_base_bdevs_discovered": 3, 00:14:03.373 "num_base_bdevs_operational": 3, 00:14:03.373 "base_bdevs_list": [ 00:14:03.373 { 00:14:03.373 "name": "NewBaseBdev", 00:14:03.373 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:14:03.373 "is_configured": true, 00:14:03.373 "data_offset": 2048, 00:14:03.373 "data_size": 63488 00:14:03.373 }, 00:14:03.373 { 00:14:03.373 "name": "BaseBdev2", 00:14:03.373 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:14:03.373 "is_configured": true, 00:14:03.373 "data_offset": 2048, 00:14:03.373 "data_size": 63488 00:14:03.373 }, 00:14:03.373 { 00:14:03.373 "name": "BaseBdev3", 00:14:03.373 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:14:03.373 "is_configured": true, 00:14:03.373 "data_offset": 2048, 00:14:03.373 "data_size": 63488 00:14:03.373 } 00:14:03.373 ] 00:14:03.373 }' 00:14:03.373 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.373 11:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.939 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:03.939 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:03.939 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:03.939 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:03.939 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:03.939 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:03.939 11:54:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:03.939 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:03.939 [2024-07-12 11:54:54.087042] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:03.939 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:03.939 "name": "Existed_Raid", 00:14:03.939 "aliases": [ 00:14:03.939 "06f4d934-18f5-4f62-9862-5c437a62b5ba" 00:14:03.939 ], 00:14:03.939 "product_name": "Raid Volume", 00:14:03.939 "block_size": 512, 00:14:03.939 "num_blocks": 63488, 00:14:03.939 "uuid": "06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:14:03.939 "assigned_rate_limits": { 00:14:03.939 "rw_ios_per_sec": 0, 00:14:03.939 "rw_mbytes_per_sec": 0, 00:14:03.939 "r_mbytes_per_sec": 0, 00:14:03.939 "w_mbytes_per_sec": 0 00:14:03.939 }, 00:14:03.939 "claimed": false, 00:14:03.939 "zoned": false, 00:14:03.939 "supported_io_types": { 00:14:03.939 "read": true, 00:14:03.939 "write": true, 00:14:03.939 "unmap": false, 00:14:03.939 "flush": false, 00:14:03.939 "reset": true, 00:14:03.939 "nvme_admin": false, 00:14:03.939 "nvme_io": false, 00:14:03.939 "nvme_io_md": false, 00:14:03.939 "write_zeroes": true, 00:14:03.939 "zcopy": false, 00:14:03.939 "get_zone_info": false, 00:14:03.939 "zone_management": false, 00:14:03.939 "zone_append": false, 00:14:03.939 "compare": false, 00:14:03.939 "compare_and_write": false, 00:14:03.939 "abort": false, 00:14:03.939 "seek_hole": false, 00:14:03.939 "seek_data": false, 00:14:03.939 "copy": false, 00:14:03.939 "nvme_iov_md": false 00:14:03.939 }, 00:14:03.939 "memory_domains": [ 00:14:03.939 { 00:14:03.939 "dma_device_id": "system", 00:14:03.939 "dma_device_type": 1 00:14:03.939 }, 00:14:03.939 { 00:14:03.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.939 
"dma_device_type": 2 00:14:03.939 }, 00:14:03.939 { 00:14:03.939 "dma_device_id": "system", 00:14:03.939 "dma_device_type": 1 00:14:03.939 }, 00:14:03.939 { 00:14:03.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.939 "dma_device_type": 2 00:14:03.939 }, 00:14:03.939 { 00:14:03.939 "dma_device_id": "system", 00:14:03.939 "dma_device_type": 1 00:14:03.939 }, 00:14:03.939 { 00:14:03.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.939 "dma_device_type": 2 00:14:03.939 } 00:14:03.939 ], 00:14:03.939 "driver_specific": { 00:14:03.939 "raid": { 00:14:03.939 "uuid": "06f4d934-18f5-4f62-9862-5c437a62b5ba", 00:14:03.939 "strip_size_kb": 0, 00:14:03.939 "state": "online", 00:14:03.940 "raid_level": "raid1", 00:14:03.940 "superblock": true, 00:14:03.940 "num_base_bdevs": 3, 00:14:03.940 "num_base_bdevs_discovered": 3, 00:14:03.940 "num_base_bdevs_operational": 3, 00:14:03.940 "base_bdevs_list": [ 00:14:03.940 { 00:14:03.940 "name": "NewBaseBdev", 00:14:03.940 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:14:03.940 "is_configured": true, 00:14:03.940 "data_offset": 2048, 00:14:03.940 "data_size": 63488 00:14:03.940 }, 00:14:03.940 { 00:14:03.940 "name": "BaseBdev2", 00:14:03.940 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:14:03.940 "is_configured": true, 00:14:03.940 "data_offset": 2048, 00:14:03.940 "data_size": 63488 00:14:03.940 }, 00:14:03.940 { 00:14:03.940 "name": "BaseBdev3", 00:14:03.940 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:14:03.940 "is_configured": true, 00:14:03.940 "data_offset": 2048, 00:14:03.940 "data_size": 63488 00:14:03.940 } 00:14:03.940 ] 00:14:03.940 } 00:14:03.940 } 00:14:03.940 }' 00:14:03.940 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:03.940 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:03.940 BaseBdev2 00:14:03.940 
BaseBdev3' 00:14:03.940 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:03.940 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:03.940 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:04.198 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:04.198 "name": "NewBaseBdev", 00:14:04.198 "aliases": [ 00:14:04.198 "58175efa-3e45-4b79-991f-e5313ffc03cd" 00:14:04.198 ], 00:14:04.198 "product_name": "Malloc disk", 00:14:04.198 "block_size": 512, 00:14:04.198 "num_blocks": 65536, 00:14:04.198 "uuid": "58175efa-3e45-4b79-991f-e5313ffc03cd", 00:14:04.198 "assigned_rate_limits": { 00:14:04.198 "rw_ios_per_sec": 0, 00:14:04.198 "rw_mbytes_per_sec": 0, 00:14:04.198 "r_mbytes_per_sec": 0, 00:14:04.198 "w_mbytes_per_sec": 0 00:14:04.198 }, 00:14:04.198 "claimed": true, 00:14:04.198 "claim_type": "exclusive_write", 00:14:04.198 "zoned": false, 00:14:04.198 "supported_io_types": { 00:14:04.198 "read": true, 00:14:04.198 "write": true, 00:14:04.198 "unmap": true, 00:14:04.198 "flush": true, 00:14:04.198 "reset": true, 00:14:04.198 "nvme_admin": false, 00:14:04.198 "nvme_io": false, 00:14:04.198 "nvme_io_md": false, 00:14:04.198 "write_zeroes": true, 00:14:04.198 "zcopy": true, 00:14:04.198 "get_zone_info": false, 00:14:04.198 "zone_management": false, 00:14:04.198 "zone_append": false, 00:14:04.198 "compare": false, 00:14:04.198 "compare_and_write": false, 00:14:04.198 "abort": true, 00:14:04.198 "seek_hole": false, 00:14:04.198 "seek_data": false, 00:14:04.198 "copy": true, 00:14:04.198 "nvme_iov_md": false 00:14:04.198 }, 00:14:04.198 "memory_domains": [ 00:14:04.198 { 00:14:04.198 "dma_device_id": "system", 00:14:04.198 "dma_device_type": 1 00:14:04.198 }, 00:14:04.198 { 
00:14:04.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.198 "dma_device_type": 2 00:14:04.198 } 00:14:04.198 ], 00:14:04.198 "driver_specific": {} 00:14:04.198 }' 00:14:04.198 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.198 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.198 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:04.198 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:04.456 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:04.715 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:04.715 "name": 
"BaseBdev2", 00:14:04.715 "aliases": [ 00:14:04.715 "52076b56-80aa-4064-8973-94241ed6b0d6" 00:14:04.715 ], 00:14:04.715 "product_name": "Malloc disk", 00:14:04.715 "block_size": 512, 00:14:04.715 "num_blocks": 65536, 00:14:04.715 "uuid": "52076b56-80aa-4064-8973-94241ed6b0d6", 00:14:04.715 "assigned_rate_limits": { 00:14:04.715 "rw_ios_per_sec": 0, 00:14:04.715 "rw_mbytes_per_sec": 0, 00:14:04.715 "r_mbytes_per_sec": 0, 00:14:04.715 "w_mbytes_per_sec": 0 00:14:04.715 }, 00:14:04.715 "claimed": true, 00:14:04.715 "claim_type": "exclusive_write", 00:14:04.715 "zoned": false, 00:14:04.715 "supported_io_types": { 00:14:04.715 "read": true, 00:14:04.715 "write": true, 00:14:04.715 "unmap": true, 00:14:04.715 "flush": true, 00:14:04.715 "reset": true, 00:14:04.715 "nvme_admin": false, 00:14:04.715 "nvme_io": false, 00:14:04.715 "nvme_io_md": false, 00:14:04.715 "write_zeroes": true, 00:14:04.715 "zcopy": true, 00:14:04.715 "get_zone_info": false, 00:14:04.715 "zone_management": false, 00:14:04.715 "zone_append": false, 00:14:04.715 "compare": false, 00:14:04.715 "compare_and_write": false, 00:14:04.715 "abort": true, 00:14:04.715 "seek_hole": false, 00:14:04.715 "seek_data": false, 00:14:04.715 "copy": true, 00:14:04.715 "nvme_iov_md": false 00:14:04.715 }, 00:14:04.715 "memory_domains": [ 00:14:04.715 { 00:14:04.715 "dma_device_id": "system", 00:14:04.715 "dma_device_type": 1 00:14:04.715 }, 00:14:04.715 { 00:14:04.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.715 "dma_device_type": 2 00:14:04.715 } 00:14:04.715 ], 00:14:04.715 "driver_specific": {} 00:14:04.715 }' 00:14:04.715 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.715 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.715 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:04.715 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:14:04.715 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.715 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:04.715 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.974 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.974 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:04.974 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.974 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.974 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:04.974 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:04.974 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:04.974 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:05.232 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:05.232 "name": "BaseBdev3", 00:14:05.232 "aliases": [ 00:14:05.232 "ce05c121-c9a3-4a43-816a-3e49144767a2" 00:14:05.232 ], 00:14:05.232 "product_name": "Malloc disk", 00:14:05.232 "block_size": 512, 00:14:05.232 "num_blocks": 65536, 00:14:05.232 "uuid": "ce05c121-c9a3-4a43-816a-3e49144767a2", 00:14:05.232 "assigned_rate_limits": { 00:14:05.232 "rw_ios_per_sec": 0, 00:14:05.232 "rw_mbytes_per_sec": 0, 00:14:05.232 "r_mbytes_per_sec": 0, 00:14:05.232 "w_mbytes_per_sec": 0 00:14:05.232 }, 00:14:05.232 "claimed": true, 00:14:05.232 "claim_type": "exclusive_write", 00:14:05.232 "zoned": 
false, 00:14:05.232 "supported_io_types": { 00:14:05.232 "read": true, 00:14:05.232 "write": true, 00:14:05.232 "unmap": true, 00:14:05.232 "flush": true, 00:14:05.232 "reset": true, 00:14:05.232 "nvme_admin": false, 00:14:05.232 "nvme_io": false, 00:14:05.232 "nvme_io_md": false, 00:14:05.232 "write_zeroes": true, 00:14:05.232 "zcopy": true, 00:14:05.232 "get_zone_info": false, 00:14:05.232 "zone_management": false, 00:14:05.232 "zone_append": false, 00:14:05.233 "compare": false, 00:14:05.233 "compare_and_write": false, 00:14:05.233 "abort": true, 00:14:05.233 "seek_hole": false, 00:14:05.233 "seek_data": false, 00:14:05.233 "copy": true, 00:14:05.233 "nvme_iov_md": false 00:14:05.233 }, 00:14:05.233 "memory_domains": [ 00:14:05.233 { 00:14:05.233 "dma_device_id": "system", 00:14:05.233 "dma_device_type": 1 00:14:05.233 }, 00:14:05.233 { 00:14:05.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.233 "dma_device_type": 2 00:14:05.233 } 00:14:05.233 ], 00:14:05.233 "driver_specific": {} 00:14:05.233 }' 00:14:05.233 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.233 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.233 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:05.233 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:05.233 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:05.233 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:05.233 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:05.233 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:05.233 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:05.233 11:54:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:05.491 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:05.491 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:05.491 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:05.491 [2024-07-12 11:54:55.707066] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:05.491 [2024-07-12 11:54:55.707084] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:05.491 [2024-07-12 11:54:55.707117] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:05.491 [2024-07-12 11:54:55.707288] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:05.491 [2024-07-12 11:54:55.707293] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228b830 name Existed_Raid, state offline 00:14:05.491 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 640098 00:14:05.491 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 640098 ']' 00:14:05.491 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 640098 00:14:05.491 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:05.491 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:05.491 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 640098 00:14:05.750 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:14:05.750 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:05.750 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 640098' 00:14:05.750 killing process with pid 640098 00:14:05.750 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 640098 00:14:05.750 [2024-07-12 11:54:55.770351] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:05.750 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 640098 00:14:05.750 [2024-07-12 11:54:55.793397] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:05.750 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:05.750 00:14:05.750 real 0m21.161s 00:14:05.750 user 0m39.440s 00:14:05.750 sys 0m3.289s 00:14:05.750 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:05.750 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:05.750 ************************************ 00:14:05.750 END TEST raid_state_function_test_sb 00:14:05.750 ************************************ 00:14:06.008 11:54:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:06.008 11:54:55 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:14:06.008 11:54:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:06.008 11:54:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:06.008 11:54:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:06.008 ************************************ 00:14:06.008 START TEST raid_superblock_test 00:14:06.008 ************************************ 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=644127 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 644127 /var/tmp/spdk-raid.sock 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 644127 ']' 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:06.008 11:54:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:06.009 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:06.009 11:54:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:06.009 11:54:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.009 [2024-07-12 11:54:56.079430] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:14:06.009 [2024-07-12 11:54:56.079469] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644127 ] 00:14:06.009 [2024-07-12 11:54:56.141188] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:06.009 [2024-07-12 11:54:56.218953] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:06.266 [2024-07-12 11:54:56.272300] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:06.266 [2024-07-12 11:54:56.272338] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:06.832 11:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:14:06.832 malloc1 00:14:06.832 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:07.092 [2024-07-12 11:54:57.195897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:07.092 [2024-07-12 11:54:57.195927] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:07.092 [2024-07-12 11:54:57.195939] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16b6270 00:14:07.092 [2024-07-12 11:54:57.195945] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:07.092 [2024-07-12 11:54:57.197211] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:07.092 [2024-07-12 11:54:57.197231] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:07.092 pt1 00:14:07.092 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:07.092 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:07.092 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:07.092 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:07.092 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:07.092 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:07.092 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:07.092 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:07.092 11:54:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:07.351 malloc2 00:14:07.351 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:07.351 [2024-07-12 11:54:57.520221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:07.351 [2024-07-12 11:54:57.520248] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:07.351 [2024-07-12 11:54:57.520260] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16b7580 00:14:07.351 [2024-07-12 11:54:57.520281] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:07.351 [2024-07-12 11:54:57.521277] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:07.351 [2024-07-12 11:54:57.521297] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:07.351 pt2 00:14:07.351 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:07.351 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:07.351 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:07.351 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:07.352 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:07.352 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:07.352 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:07.352 11:54:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:07.352 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:07.610 malloc3 00:14:07.611 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:07.611 [2024-07-12 11:54:57.844388] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:07.611 [2024-07-12 11:54:57.844417] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:07.611 [2024-07-12 11:54:57.844427] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1861e30 00:14:07.611 [2024-07-12 11:54:57.844432] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:07.611 [2024-07-12 11:54:57.845500] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:07.611 [2024-07-12 11:54:57.845528] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:07.611 pt3 00:14:07.870 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:07.870 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:07.870 11:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:07.870 [2024-07-12 11:54:58.012839] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:07.870 [2024-07-12 11:54:58.013745] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:07.870 [2024-07-12 11:54:58.013782] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:07.870 [2024-07-12 11:54:58.013886] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1865390 00:14:07.870 [2024-07-12 11:54:58.013892] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:07.870 [2024-07-12 11:54:58.014025] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1867c00 00:14:07.870 [2024-07-12 11:54:58.014126] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1865390 00:14:07.870 [2024-07-12 11:54:58.014131] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1865390 00:14:07.870 [2024-07-12 11:54:58.014197] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.870 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:08.130 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.130 "name": "raid_bdev1", 00:14:08.130 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:08.130 "strip_size_kb": 0, 00:14:08.130 "state": "online", 00:14:08.130 "raid_level": "raid1", 00:14:08.130 "superblock": true, 00:14:08.130 "num_base_bdevs": 3, 00:14:08.130 "num_base_bdevs_discovered": 3, 00:14:08.130 "num_base_bdevs_operational": 3, 00:14:08.130 "base_bdevs_list": [ 00:14:08.130 { 00:14:08.130 "name": "pt1", 00:14:08.130 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:08.130 "is_configured": true, 00:14:08.130 "data_offset": 2048, 00:14:08.130 "data_size": 63488 00:14:08.130 }, 00:14:08.130 { 00:14:08.130 "name": "pt2", 00:14:08.130 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:08.130 "is_configured": true, 00:14:08.130 "data_offset": 2048, 00:14:08.130 "data_size": 63488 00:14:08.130 }, 00:14:08.130 { 00:14:08.130 "name": "pt3", 00:14:08.130 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:08.130 "is_configured": true, 00:14:08.130 "data_offset": 2048, 00:14:08.130 "data_size": 63488 00:14:08.130 } 00:14:08.130 ] 00:14:08.130 }' 00:14:08.130 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.130 11:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:08.698 [2024-07-12 11:54:58.823083] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:08.698 "name": "raid_bdev1", 00:14:08.698 "aliases": [ 00:14:08.698 "053e32e1-804d-4ea7-bdd6-497900c26f9e" 00:14:08.698 ], 00:14:08.698 "product_name": "Raid Volume", 00:14:08.698 "block_size": 512, 00:14:08.698 "num_blocks": 63488, 00:14:08.698 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:08.698 "assigned_rate_limits": { 00:14:08.698 "rw_ios_per_sec": 0, 00:14:08.698 "rw_mbytes_per_sec": 0, 00:14:08.698 "r_mbytes_per_sec": 0, 00:14:08.698 "w_mbytes_per_sec": 0 00:14:08.698 }, 00:14:08.698 "claimed": false, 00:14:08.698 "zoned": false, 00:14:08.698 "supported_io_types": { 00:14:08.698 "read": true, 00:14:08.698 "write": true, 00:14:08.698 "unmap": false, 00:14:08.698 "flush": false, 00:14:08.698 "reset": true, 00:14:08.698 "nvme_admin": false, 00:14:08.698 "nvme_io": false, 00:14:08.698 "nvme_io_md": false, 00:14:08.698 "write_zeroes": true, 00:14:08.698 "zcopy": false, 00:14:08.698 "get_zone_info": false, 00:14:08.698 "zone_management": false, 00:14:08.698 "zone_append": false, 00:14:08.698 "compare": false, 00:14:08.698 "compare_and_write": false, 00:14:08.698 "abort": false, 00:14:08.698 "seek_hole": false, 00:14:08.698 "seek_data": false, 
00:14:08.698 "copy": false, 00:14:08.698 "nvme_iov_md": false 00:14:08.698 }, 00:14:08.698 "memory_domains": [ 00:14:08.698 { 00:14:08.698 "dma_device_id": "system", 00:14:08.698 "dma_device_type": 1 00:14:08.698 }, 00:14:08.698 { 00:14:08.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.698 "dma_device_type": 2 00:14:08.698 }, 00:14:08.698 { 00:14:08.698 "dma_device_id": "system", 00:14:08.698 "dma_device_type": 1 00:14:08.698 }, 00:14:08.698 { 00:14:08.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.698 "dma_device_type": 2 00:14:08.698 }, 00:14:08.698 { 00:14:08.698 "dma_device_id": "system", 00:14:08.698 "dma_device_type": 1 00:14:08.698 }, 00:14:08.698 { 00:14:08.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.698 "dma_device_type": 2 00:14:08.698 } 00:14:08.698 ], 00:14:08.698 "driver_specific": { 00:14:08.698 "raid": { 00:14:08.698 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:08.698 "strip_size_kb": 0, 00:14:08.698 "state": "online", 00:14:08.698 "raid_level": "raid1", 00:14:08.698 "superblock": true, 00:14:08.698 "num_base_bdevs": 3, 00:14:08.698 "num_base_bdevs_discovered": 3, 00:14:08.698 "num_base_bdevs_operational": 3, 00:14:08.698 "base_bdevs_list": [ 00:14:08.698 { 00:14:08.698 "name": "pt1", 00:14:08.698 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:08.698 "is_configured": true, 00:14:08.698 "data_offset": 2048, 00:14:08.698 "data_size": 63488 00:14:08.698 }, 00:14:08.698 { 00:14:08.698 "name": "pt2", 00:14:08.698 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:08.698 "is_configured": true, 00:14:08.698 "data_offset": 2048, 00:14:08.698 "data_size": 63488 00:14:08.698 }, 00:14:08.698 { 00:14:08.698 "name": "pt3", 00:14:08.698 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:08.698 "is_configured": true, 00:14:08.698 "data_offset": 2048, 00:14:08.698 "data_size": 63488 00:14:08.698 } 00:14:08.698 ] 00:14:08.698 } 00:14:08.698 } 00:14:08.698 }' 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:08.698 pt2 00:14:08.698 pt3' 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:08.698 11:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:08.957 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:08.957 "name": "pt1", 00:14:08.957 "aliases": [ 00:14:08.957 "00000000-0000-0000-0000-000000000001" 00:14:08.957 ], 00:14:08.957 "product_name": "passthru", 00:14:08.957 "block_size": 512, 00:14:08.957 "num_blocks": 65536, 00:14:08.957 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:08.957 "assigned_rate_limits": { 00:14:08.957 "rw_ios_per_sec": 0, 00:14:08.957 "rw_mbytes_per_sec": 0, 00:14:08.957 "r_mbytes_per_sec": 0, 00:14:08.957 "w_mbytes_per_sec": 0 00:14:08.957 }, 00:14:08.957 "claimed": true, 00:14:08.957 "claim_type": "exclusive_write", 00:14:08.957 "zoned": false, 00:14:08.957 "supported_io_types": { 00:14:08.957 "read": true, 00:14:08.957 "write": true, 00:14:08.957 "unmap": true, 00:14:08.957 "flush": true, 00:14:08.957 "reset": true, 00:14:08.957 "nvme_admin": false, 00:14:08.957 "nvme_io": false, 00:14:08.957 "nvme_io_md": false, 00:14:08.957 "write_zeroes": true, 00:14:08.957 "zcopy": true, 00:14:08.957 "get_zone_info": false, 00:14:08.957 "zone_management": false, 00:14:08.957 "zone_append": false, 00:14:08.957 "compare": false, 00:14:08.957 "compare_and_write": false, 00:14:08.957 "abort": true, 00:14:08.957 "seek_hole": false, 00:14:08.957 "seek_data": false, 00:14:08.957 "copy": true, 00:14:08.957 
"nvme_iov_md": false 00:14:08.957 }, 00:14:08.958 "memory_domains": [ 00:14:08.958 { 00:14:08.958 "dma_device_id": "system", 00:14:08.958 "dma_device_type": 1 00:14:08.958 }, 00:14:08.958 { 00:14:08.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.958 "dma_device_type": 2 00:14:08.958 } 00:14:08.958 ], 00:14:08.958 "driver_specific": { 00:14:08.958 "passthru": { 00:14:08.958 "name": "pt1", 00:14:08.958 "base_bdev_name": "malloc1" 00:14:08.958 } 00:14:08.958 } 00:14:08.958 }' 00:14:08.958 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.958 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.958 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:08.958 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.958 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.958 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:08.958 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.216 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.216 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:09.216 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.216 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.216 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:09.216 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:09.216 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:09.216 
11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:09.476 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:09.476 "name": "pt2", 00:14:09.476 "aliases": [ 00:14:09.476 "00000000-0000-0000-0000-000000000002" 00:14:09.476 ], 00:14:09.476 "product_name": "passthru", 00:14:09.476 "block_size": 512, 00:14:09.476 "num_blocks": 65536, 00:14:09.476 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:09.476 "assigned_rate_limits": { 00:14:09.476 "rw_ios_per_sec": 0, 00:14:09.476 "rw_mbytes_per_sec": 0, 00:14:09.476 "r_mbytes_per_sec": 0, 00:14:09.476 "w_mbytes_per_sec": 0 00:14:09.476 }, 00:14:09.476 "claimed": true, 00:14:09.476 "claim_type": "exclusive_write", 00:14:09.476 "zoned": false, 00:14:09.476 "supported_io_types": { 00:14:09.476 "read": true, 00:14:09.476 "write": true, 00:14:09.476 "unmap": true, 00:14:09.476 "flush": true, 00:14:09.476 "reset": true, 00:14:09.476 "nvme_admin": false, 00:14:09.476 "nvme_io": false, 00:14:09.476 "nvme_io_md": false, 00:14:09.476 "write_zeroes": true, 00:14:09.476 "zcopy": true, 00:14:09.476 "get_zone_info": false, 00:14:09.476 "zone_management": false, 00:14:09.476 "zone_append": false, 00:14:09.476 "compare": false, 00:14:09.476 "compare_and_write": false, 00:14:09.476 "abort": true, 00:14:09.476 "seek_hole": false, 00:14:09.476 "seek_data": false, 00:14:09.476 "copy": true, 00:14:09.476 "nvme_iov_md": false 00:14:09.476 }, 00:14:09.476 "memory_domains": [ 00:14:09.476 { 00:14:09.476 "dma_device_id": "system", 00:14:09.477 "dma_device_type": 1 00:14:09.477 }, 00:14:09.477 { 00:14:09.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.477 "dma_device_type": 2 00:14:09.477 } 00:14:09.477 ], 00:14:09.477 "driver_specific": { 00:14:09.477 "passthru": { 00:14:09.477 "name": "pt2", 00:14:09.477 "base_bdev_name": "malloc2" 00:14:09.477 } 00:14:09.477 } 00:14:09.477 }' 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:09.477 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.736 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.736 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:09.736 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:09.736 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:09.736 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:09.736 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:09.736 "name": "pt3", 00:14:09.736 "aliases": [ 00:14:09.736 "00000000-0000-0000-0000-000000000003" 00:14:09.736 ], 00:14:09.736 "product_name": "passthru", 00:14:09.736 "block_size": 512, 00:14:09.736 "num_blocks": 65536, 00:14:09.736 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:09.736 "assigned_rate_limits": { 00:14:09.736 "rw_ios_per_sec": 0, 00:14:09.736 "rw_mbytes_per_sec": 0, 
00:14:09.736 "r_mbytes_per_sec": 0, 00:14:09.736 "w_mbytes_per_sec": 0 00:14:09.736 }, 00:14:09.736 "claimed": true, 00:14:09.736 "claim_type": "exclusive_write", 00:14:09.736 "zoned": false, 00:14:09.736 "supported_io_types": { 00:14:09.736 "read": true, 00:14:09.736 "write": true, 00:14:09.736 "unmap": true, 00:14:09.736 "flush": true, 00:14:09.736 "reset": true, 00:14:09.736 "nvme_admin": false, 00:14:09.736 "nvme_io": false, 00:14:09.736 "nvme_io_md": false, 00:14:09.736 "write_zeroes": true, 00:14:09.736 "zcopy": true, 00:14:09.736 "get_zone_info": false, 00:14:09.736 "zone_management": false, 00:14:09.736 "zone_append": false, 00:14:09.736 "compare": false, 00:14:09.736 "compare_and_write": false, 00:14:09.736 "abort": true, 00:14:09.736 "seek_hole": false, 00:14:09.736 "seek_data": false, 00:14:09.737 "copy": true, 00:14:09.737 "nvme_iov_md": false 00:14:09.737 }, 00:14:09.737 "memory_domains": [ 00:14:09.737 { 00:14:09.737 "dma_device_id": "system", 00:14:09.737 "dma_device_type": 1 00:14:09.737 }, 00:14:09.737 { 00:14:09.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.737 "dma_device_type": 2 00:14:09.737 } 00:14:09.737 ], 00:14:09.737 "driver_specific": { 00:14:09.737 "passthru": { 00:14:09.737 "name": "pt3", 00:14:09.737 "base_bdev_name": "malloc3" 00:14:09.737 } 00:14:09.737 } 00:14:09.737 }' 00:14:09.737 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.995 11:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.995 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:09.995 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.995 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.995 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:09.995 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:14:09.995 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.995 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:09.995 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.253 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.253 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:10.253 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:10.253 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:10.253 [2024-07-12 11:55:00.439241] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:10.253 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=053e32e1-804d-4ea7-bdd6-497900c26f9e 00:14:10.253 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 053e32e1-804d-4ea7-bdd6-497900c26f9e ']' 00:14:10.253 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:10.512 [2024-07-12 11:55:00.615525] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:10.512 [2024-07-12 11:55:00.615537] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:10.512 [2024-07-12 11:55:00.615570] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:10.512 [2024-07-12 11:55:00.615615] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:10.512 [2024-07-12 11:55:00.615621] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1865390 name raid_bdev1, state offline 00:14:10.512 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.512 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:10.771 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:10.771 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:10.771 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:10.771 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:10.771 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:10.771 11:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:11.029 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:11.029 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:11.288 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:11.548 
[2024-07-12 11:55:01.618105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:11.548 [2024-07-12 11:55:01.619083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:11.548 [2024-07-12 11:55:01.619114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:11.548 [2024-07-12 11:55:01.619147] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:11.548 [2024-07-12 11:55:01.619172] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:11.548 [2024-07-12 11:55:01.619184] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:11.548 [2024-07-12 11:55:01.619209] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:11.548 [2024-07-12 11:55:01.619216] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x185f470 name raid_bdev1, state configuring 00:14:11.548 request: 00:14:11.548 { 00:14:11.548 "name": "raid_bdev1", 00:14:11.548 "raid_level": "raid1", 00:14:11.548 "base_bdevs": [ 00:14:11.548 "malloc1", 00:14:11.548 "malloc2", 00:14:11.548 "malloc3" 00:14:11.548 ], 00:14:11.548 "superblock": false, 00:14:11.548 "method": "bdev_raid_create", 00:14:11.548 "req_id": 1 00:14:11.548 } 00:14:11.548 Got JSON-RPC error response 00:14:11.548 response: 00:14:11.548 { 00:14:11.548 "code": -17, 00:14:11.548 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:11.548 } 00:14:11.548 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:11.548 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:11.548 11:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:11.548 11:55:01 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:11.548 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.548 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:11.807 [2024-07-12 11:55:01.946919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:11.807 [2024-07-12 11:55:01.946949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:11.807 [2024-07-12 11:55:01.946958] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1862060 00:14:11.807 [2024-07-12 11:55:01.946964] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:11.807 [2024-07-12 11:55:01.948165] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:11.807 [2024-07-12 11:55:01.948185] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:11.807 [2024-07-12 11:55:01.948230] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:11.807 [2024-07-12 11:55:01.948249] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:11.807 pt1 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.807 11:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:12.066 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.066 "name": "raid_bdev1", 00:14:12.066 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:12.066 "strip_size_kb": 0, 00:14:12.066 "state": "configuring", 00:14:12.066 "raid_level": "raid1", 00:14:12.066 "superblock": true, 00:14:12.066 "num_base_bdevs": 3, 00:14:12.066 "num_base_bdevs_discovered": 1, 00:14:12.066 "num_base_bdevs_operational": 3, 00:14:12.066 "base_bdevs_list": [ 00:14:12.066 { 00:14:12.066 "name": "pt1", 00:14:12.066 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:12.066 "is_configured": true, 00:14:12.066 "data_offset": 2048, 00:14:12.066 "data_size": 63488 00:14:12.066 }, 00:14:12.066 { 00:14:12.066 "name": null, 
00:14:12.066 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:12.066 "is_configured": false, 00:14:12.066 "data_offset": 2048, 00:14:12.066 "data_size": 63488 00:14:12.066 }, 00:14:12.066 { 00:14:12.066 "name": null, 00:14:12.066 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:12.066 "is_configured": false, 00:14:12.066 "data_offset": 2048, 00:14:12.066 "data_size": 63488 00:14:12.066 } 00:14:12.066 ] 00:14:12.066 }' 00:14:12.066 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.066 11:55:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.634 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:12.634 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:12.634 [2024-07-12 11:55:02.765031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:12.634 [2024-07-12 11:55:02.765064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:12.634 [2024-07-12 11:55:02.765077] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16b6d90 00:14:12.634 [2024-07-12 11:55:02.765083] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:12.634 [2024-07-12 11:55:02.765320] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:12.634 [2024-07-12 11:55:02.765329] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:12.634 [2024-07-12 11:55:02.765373] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:12.634 [2024-07-12 11:55:02.765386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:12.634 pt2 00:14:12.634 11:55:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:12.893 [2024-07-12 11:55:02.929464] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.893 11:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:12.893 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.893 "name": "raid_bdev1", 00:14:12.893 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:12.893 "strip_size_kb": 0, 00:14:12.893 "state": "configuring", 00:14:12.893 
"raid_level": "raid1", 00:14:12.893 "superblock": true, 00:14:12.893 "num_base_bdevs": 3, 00:14:12.893 "num_base_bdevs_discovered": 1, 00:14:12.893 "num_base_bdevs_operational": 3, 00:14:12.893 "base_bdevs_list": [ 00:14:12.893 { 00:14:12.893 "name": "pt1", 00:14:12.893 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:12.893 "is_configured": true, 00:14:12.893 "data_offset": 2048, 00:14:12.893 "data_size": 63488 00:14:12.893 }, 00:14:12.893 { 00:14:12.893 "name": null, 00:14:12.893 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:12.893 "is_configured": false, 00:14:12.893 "data_offset": 2048, 00:14:12.893 "data_size": 63488 00:14:12.893 }, 00:14:12.893 { 00:14:12.893 "name": null, 00:14:12.893 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:12.893 "is_configured": false, 00:14:12.893 "data_offset": 2048, 00:14:12.893 "data_size": 63488 00:14:12.893 } 00:14:12.893 ] 00:14:12.893 }' 00:14:12.893 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.893 11:55:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.458 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:13.458 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:13.458 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:13.718 [2024-07-12 11:55:03.759623] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:13.718 [2024-07-12 11:55:03.759662] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:13.718 [2024-07-12 11:55:03.759672] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1865fa0 00:14:13.718 [2024-07-12 11:55:03.759678] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:13.718 [2024-07-12 11:55:03.759920] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:13.718 [2024-07-12 11:55:03.759929] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:13.718 [2024-07-12 11:55:03.759972] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:13.718 [2024-07-12 11:55:03.759984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:13.718 pt2 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:13.718 [2024-07-12 11:55:03.932056] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:13.718 [2024-07-12 11:55:03.932076] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:13.718 [2024-07-12 11:55:03.932083] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1864540 00:14:13.718 [2024-07-12 11:55:03.932088] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:13.718 [2024-07-12 11:55:03.932295] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:13.718 [2024-07-12 11:55:03.932304] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:13.718 [2024-07-12 11:55:03.932337] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:13.718 [2024-07-12 11:55:03.932348] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:13.718 [2024-07-12 
11:55:03.932423] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1860820 00:14:13.718 [2024-07-12 11:55:03.932429] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:13.718 [2024-07-12 11:55:03.932543] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17466a0 00:14:13.718 [2024-07-12 11:55:03.932634] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1860820 00:14:13.718 [2024-07-12 11:55:03.932639] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1860820 00:14:13.718 [2024-07-12 11:55:03.932703] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:13.718 pt3 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.718 11:55:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.718 11:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:13.977 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.977 "name": "raid_bdev1", 00:14:13.977 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:13.977 "strip_size_kb": 0, 00:14:13.977 "state": "online", 00:14:13.977 "raid_level": "raid1", 00:14:13.977 "superblock": true, 00:14:13.977 "num_base_bdevs": 3, 00:14:13.977 "num_base_bdevs_discovered": 3, 00:14:13.977 "num_base_bdevs_operational": 3, 00:14:13.977 "base_bdevs_list": [ 00:14:13.977 { 00:14:13.977 "name": "pt1", 00:14:13.977 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:13.977 "is_configured": true, 00:14:13.977 "data_offset": 2048, 00:14:13.977 "data_size": 63488 00:14:13.977 }, 00:14:13.977 { 00:14:13.977 "name": "pt2", 00:14:13.977 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:13.977 "is_configured": true, 00:14:13.977 "data_offset": 2048, 00:14:13.977 "data_size": 63488 00:14:13.977 }, 00:14:13.977 { 00:14:13.977 "name": "pt3", 00:14:13.977 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:13.977 "is_configured": true, 00:14:13.977 "data_offset": 2048, 00:14:13.977 "data_size": 63488 00:14:13.977 } 00:14:13.977 ] 00:14:13.977 }' 00:14:13.977 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.977 11:55:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:14.601 [2024-07-12 11:55:04.766430] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:14.601 "name": "raid_bdev1", 00:14:14.601 "aliases": [ 00:14:14.601 "053e32e1-804d-4ea7-bdd6-497900c26f9e" 00:14:14.601 ], 00:14:14.601 "product_name": "Raid Volume", 00:14:14.601 "block_size": 512, 00:14:14.601 "num_blocks": 63488, 00:14:14.601 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:14.601 "assigned_rate_limits": { 00:14:14.601 "rw_ios_per_sec": 0, 00:14:14.601 "rw_mbytes_per_sec": 0, 00:14:14.601 "r_mbytes_per_sec": 0, 00:14:14.601 "w_mbytes_per_sec": 0 00:14:14.601 }, 00:14:14.601 "claimed": false, 00:14:14.601 "zoned": false, 00:14:14.601 "supported_io_types": { 00:14:14.601 "read": true, 00:14:14.601 "write": true, 00:14:14.601 "unmap": false, 00:14:14.601 "flush": false, 00:14:14.601 "reset": true, 00:14:14.601 "nvme_admin": false, 00:14:14.601 "nvme_io": false, 00:14:14.601 "nvme_io_md": false, 00:14:14.601 "write_zeroes": true, 00:14:14.601 "zcopy": false, 00:14:14.601 "get_zone_info": false, 00:14:14.601 "zone_management": false, 00:14:14.601 "zone_append": false, 00:14:14.601 "compare": false, 
00:14:14.601 "compare_and_write": false, 00:14:14.601 "abort": false, 00:14:14.601 "seek_hole": false, 00:14:14.601 "seek_data": false, 00:14:14.601 "copy": false, 00:14:14.601 "nvme_iov_md": false 00:14:14.601 }, 00:14:14.601 "memory_domains": [ 00:14:14.601 { 00:14:14.601 "dma_device_id": "system", 00:14:14.601 "dma_device_type": 1 00:14:14.601 }, 00:14:14.601 { 00:14:14.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.601 "dma_device_type": 2 00:14:14.601 }, 00:14:14.601 { 00:14:14.601 "dma_device_id": "system", 00:14:14.601 "dma_device_type": 1 00:14:14.601 }, 00:14:14.601 { 00:14:14.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.601 "dma_device_type": 2 00:14:14.601 }, 00:14:14.601 { 00:14:14.601 "dma_device_id": "system", 00:14:14.601 "dma_device_type": 1 00:14:14.601 }, 00:14:14.601 { 00:14:14.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.601 "dma_device_type": 2 00:14:14.601 } 00:14:14.601 ], 00:14:14.601 "driver_specific": { 00:14:14.601 "raid": { 00:14:14.601 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:14.601 "strip_size_kb": 0, 00:14:14.601 "state": "online", 00:14:14.601 "raid_level": "raid1", 00:14:14.601 "superblock": true, 00:14:14.601 "num_base_bdevs": 3, 00:14:14.601 "num_base_bdevs_discovered": 3, 00:14:14.601 "num_base_bdevs_operational": 3, 00:14:14.601 "base_bdevs_list": [ 00:14:14.601 { 00:14:14.601 "name": "pt1", 00:14:14.601 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:14.601 "is_configured": true, 00:14:14.601 "data_offset": 2048, 00:14:14.601 "data_size": 63488 00:14:14.601 }, 00:14:14.601 { 00:14:14.601 "name": "pt2", 00:14:14.601 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:14.601 "is_configured": true, 00:14:14.601 "data_offset": 2048, 00:14:14.601 "data_size": 63488 00:14:14.601 }, 00:14:14.601 { 00:14:14.601 "name": "pt3", 00:14:14.601 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:14.601 "is_configured": true, 00:14:14.601 "data_offset": 2048, 00:14:14.601 "data_size": 63488 
00:14:14.601 } 00:14:14.601 ] 00:14:14.601 } 00:14:14.601 } 00:14:14.601 }' 00:14:14.601 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:14.861 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:14.861 pt2 00:14:14.861 pt3' 00:14:14.861 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:14.861 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:14.861 11:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:14.861 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:14.861 "name": "pt1", 00:14:14.861 "aliases": [ 00:14:14.861 "00000000-0000-0000-0000-000000000001" 00:14:14.861 ], 00:14:14.861 "product_name": "passthru", 00:14:14.861 "block_size": 512, 00:14:14.861 "num_blocks": 65536, 00:14:14.861 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:14.861 "assigned_rate_limits": { 00:14:14.861 "rw_ios_per_sec": 0, 00:14:14.861 "rw_mbytes_per_sec": 0, 00:14:14.861 "r_mbytes_per_sec": 0, 00:14:14.861 "w_mbytes_per_sec": 0 00:14:14.861 }, 00:14:14.861 "claimed": true, 00:14:14.861 "claim_type": "exclusive_write", 00:14:14.861 "zoned": false, 00:14:14.861 "supported_io_types": { 00:14:14.861 "read": true, 00:14:14.861 "write": true, 00:14:14.861 "unmap": true, 00:14:14.861 "flush": true, 00:14:14.861 "reset": true, 00:14:14.861 "nvme_admin": false, 00:14:14.861 "nvme_io": false, 00:14:14.861 "nvme_io_md": false, 00:14:14.861 "write_zeroes": true, 00:14:14.861 "zcopy": true, 00:14:14.861 "get_zone_info": false, 00:14:14.861 "zone_management": false, 00:14:14.861 "zone_append": false, 00:14:14.861 "compare": false, 00:14:14.861 "compare_and_write": false, 
00:14:14.861 "abort": true, 00:14:14.861 "seek_hole": false, 00:14:14.861 "seek_data": false, 00:14:14.861 "copy": true, 00:14:14.861 "nvme_iov_md": false 00:14:14.861 }, 00:14:14.861 "memory_domains": [ 00:14:14.861 { 00:14:14.861 "dma_device_id": "system", 00:14:14.861 "dma_device_type": 1 00:14:14.861 }, 00:14:14.861 { 00:14:14.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.861 "dma_device_type": 2 00:14:14.861 } 00:14:14.861 ], 00:14:14.861 "driver_specific": { 00:14:14.861 "passthru": { 00:14:14.861 "name": "pt1", 00:14:14.861 "base_bdev_name": "malloc1" 00:14:14.861 } 00:14:14.861 } 00:14:14.861 }' 00:14:14.861 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.861 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.861 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:14.861 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:15.120 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:15.379 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:15.379 "name": "pt2", 00:14:15.379 "aliases": [ 00:14:15.379 "00000000-0000-0000-0000-000000000002" 00:14:15.379 ], 00:14:15.379 "product_name": "passthru", 00:14:15.379 "block_size": 512, 00:14:15.379 "num_blocks": 65536, 00:14:15.379 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:15.379 "assigned_rate_limits": { 00:14:15.379 "rw_ios_per_sec": 0, 00:14:15.379 "rw_mbytes_per_sec": 0, 00:14:15.379 "r_mbytes_per_sec": 0, 00:14:15.379 "w_mbytes_per_sec": 0 00:14:15.379 }, 00:14:15.379 "claimed": true, 00:14:15.379 "claim_type": "exclusive_write", 00:14:15.379 "zoned": false, 00:14:15.379 "supported_io_types": { 00:14:15.379 "read": true, 00:14:15.379 "write": true, 00:14:15.379 "unmap": true, 00:14:15.379 "flush": true, 00:14:15.379 "reset": true, 00:14:15.379 "nvme_admin": false, 00:14:15.379 "nvme_io": false, 00:14:15.379 "nvme_io_md": false, 00:14:15.379 "write_zeroes": true, 00:14:15.379 "zcopy": true, 00:14:15.379 "get_zone_info": false, 00:14:15.379 "zone_management": false, 00:14:15.379 "zone_append": false, 00:14:15.379 "compare": false, 00:14:15.379 "compare_and_write": false, 00:14:15.379 "abort": true, 00:14:15.379 "seek_hole": false, 00:14:15.379 "seek_data": false, 00:14:15.379 "copy": true, 00:14:15.379 "nvme_iov_md": false 00:14:15.379 }, 00:14:15.379 "memory_domains": [ 00:14:15.379 { 00:14:15.379 "dma_device_id": "system", 00:14:15.379 "dma_device_type": 1 00:14:15.379 }, 00:14:15.379 { 00:14:15.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.379 "dma_device_type": 2 00:14:15.379 } 00:14:15.379 ], 00:14:15.379 "driver_specific": { 00:14:15.379 "passthru": { 00:14:15.379 "name": "pt2", 00:14:15.379 "base_bdev_name": "malloc2" 
00:14:15.379 } 00:14:15.379 } 00:14:15.379 }' 00:14:15.379 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.379 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.379 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:15.379 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.379 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:15.638 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:15.897 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:15.897 "name": "pt3", 00:14:15.897 "aliases": [ 00:14:15.897 "00000000-0000-0000-0000-000000000003" 00:14:15.897 ], 00:14:15.897 "product_name": "passthru", 00:14:15.897 "block_size": 512, 00:14:15.897 "num_blocks": 65536, 00:14:15.897 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:14:15.897 "assigned_rate_limits": { 00:14:15.897 "rw_ios_per_sec": 0, 00:14:15.897 "rw_mbytes_per_sec": 0, 00:14:15.897 "r_mbytes_per_sec": 0, 00:14:15.897 "w_mbytes_per_sec": 0 00:14:15.897 }, 00:14:15.897 "claimed": true, 00:14:15.897 "claim_type": "exclusive_write", 00:14:15.897 "zoned": false, 00:14:15.897 "supported_io_types": { 00:14:15.897 "read": true, 00:14:15.897 "write": true, 00:14:15.897 "unmap": true, 00:14:15.897 "flush": true, 00:14:15.897 "reset": true, 00:14:15.897 "nvme_admin": false, 00:14:15.897 "nvme_io": false, 00:14:15.897 "nvme_io_md": false, 00:14:15.897 "write_zeroes": true, 00:14:15.897 "zcopy": true, 00:14:15.897 "get_zone_info": false, 00:14:15.897 "zone_management": false, 00:14:15.897 "zone_append": false, 00:14:15.897 "compare": false, 00:14:15.897 "compare_and_write": false, 00:14:15.897 "abort": true, 00:14:15.897 "seek_hole": false, 00:14:15.897 "seek_data": false, 00:14:15.897 "copy": true, 00:14:15.897 "nvme_iov_md": false 00:14:15.897 }, 00:14:15.897 "memory_domains": [ 00:14:15.897 { 00:14:15.897 "dma_device_id": "system", 00:14:15.897 "dma_device_type": 1 00:14:15.897 }, 00:14:15.897 { 00:14:15.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.897 "dma_device_type": 2 00:14:15.897 } 00:14:15.897 ], 00:14:15.897 "driver_specific": { 00:14:15.897 "passthru": { 00:14:15.897 "name": "pt3", 00:14:15.897 "base_bdev_name": "malloc3" 00:14:15.897 } 00:14:15.897 } 00:14:15.897 }' 00:14:15.897 11:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.897 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.897 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:15.897 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.897 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.156 11:55:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:16.156 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.156 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.156 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:16.156 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.156 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.156 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:16.156 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:16.156 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:16.414 [2024-07-12 11:55:06.438723] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 053e32e1-804d-4ea7-bdd6-497900c26f9e '!=' 053e32e1-804d-4ea7-bdd6-497900c26f9e ']' 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:16.414 [2024-07-12 11:55:06.602977] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.414 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:16.673 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.673 "name": "raid_bdev1", 00:14:16.673 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:16.673 "strip_size_kb": 0, 00:14:16.673 "state": "online", 00:14:16.673 "raid_level": "raid1", 00:14:16.673 "superblock": true, 00:14:16.673 "num_base_bdevs": 3, 00:14:16.673 "num_base_bdevs_discovered": 2, 00:14:16.673 "num_base_bdevs_operational": 2, 00:14:16.673 "base_bdevs_list": [ 00:14:16.673 { 00:14:16.673 "name": null, 00:14:16.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:16.673 "is_configured": false, 00:14:16.673 "data_offset": 2048, 00:14:16.673 "data_size": 63488 
00:14:16.673 }, 00:14:16.673 { 00:14:16.673 "name": "pt2", 00:14:16.673 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:16.673 "is_configured": true, 00:14:16.673 "data_offset": 2048, 00:14:16.673 "data_size": 63488 00:14:16.673 }, 00:14:16.673 { 00:14:16.673 "name": "pt3", 00:14:16.673 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:16.673 "is_configured": true, 00:14:16.673 "data_offset": 2048, 00:14:16.673 "data_size": 63488 00:14:16.673 } 00:14:16.673 ] 00:14:16.673 }' 00:14:16.673 11:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.673 11:55:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.239 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:17.239 [2024-07-12 11:55:07.421064] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:17.239 [2024-07-12 11:55:07.421084] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:17.239 [2024-07-12 11:55:07.421122] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:17.239 [2024-07-12 11:55:07.421158] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:17.239 [2024-07-12 11:55:07.421165] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1860820 name raid_bdev1, state offline 00:14:17.239 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.239 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:14:17.498 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:14:17.498 11:55:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:14:17.498 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:14:17.498 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:17.498 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:17.756 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:17.756 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:17.756 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:17.756 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:17.756 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:17.756 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:14:17.756 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:17.756 11:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:18.015 [2024-07-12 11:55:08.082767] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:18.015 [2024-07-12 11:55:08.082800] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:18.015 [2024-07-12 11:55:08.082810] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1866610 00:14:18.015 [2024-07-12 11:55:08.082815] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:18.015 
[2024-07-12 11:55:08.084001] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:18.015 [2024-07-12 11:55:08.084022] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:18.015 [2024-07-12 11:55:08.084067] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:18.015 [2024-07-12 11:55:08.084085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:18.015 pt2 00:14:18.015 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:18.015 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:18.015 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:18.015 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:18.015 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:18.016 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:18.016 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.016 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.016 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.016 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.016 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.016 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:18.274 11:55:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.274 "name": "raid_bdev1", 00:14:18.274 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:18.274 "strip_size_kb": 0, 00:14:18.274 "state": "configuring", 00:14:18.274 "raid_level": "raid1", 00:14:18.274 "superblock": true, 00:14:18.274 "num_base_bdevs": 3, 00:14:18.274 "num_base_bdevs_discovered": 1, 00:14:18.274 "num_base_bdevs_operational": 2, 00:14:18.274 "base_bdevs_list": [ 00:14:18.274 { 00:14:18.274 "name": null, 00:14:18.274 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:18.274 "is_configured": false, 00:14:18.274 "data_offset": 2048, 00:14:18.274 "data_size": 63488 00:14:18.274 }, 00:14:18.274 { 00:14:18.274 "name": "pt2", 00:14:18.274 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:18.274 "is_configured": true, 00:14:18.274 "data_offset": 2048, 00:14:18.274 "data_size": 63488 00:14:18.274 }, 00:14:18.274 { 00:14:18.274 "name": null, 00:14:18.274 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:18.274 "is_configured": false, 00:14:18.274 "data_offset": 2048, 00:14:18.274 "data_size": 63488 00:14:18.274 } 00:14:18.274 ] 00:14:18.274 }' 00:14:18.274 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:18.274 11:55:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.533 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:14:18.533 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:18.533 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:14:18.533 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:18.791 [2024-07-12 11:55:08.872795] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 
00:14:18.791 [2024-07-12 11:55:08.872830] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:18.791 [2024-07-12 11:55:08.872839] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1865a20 00:14:18.791 [2024-07-12 11:55:08.872845] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:18.791 [2024-07-12 11:55:08.873069] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:18.791 [2024-07-12 11:55:08.873078] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:18.791 [2024-07-12 11:55:08.873118] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:18.791 [2024-07-12 11:55:08.873130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:18.791 [2024-07-12 11:55:08.873197] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1862e30 00:14:18.791 [2024-07-12 11:55:08.873202] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:18.791 [2024-07-12 11:55:08.873304] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1864490 00:14:18.791 [2024-07-12 11:55:08.873385] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1862e30 00:14:18.791 [2024-07-12 11:55:08.873390] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1862e30 00:14:18.791 [2024-07-12 11:55:08.873449] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:18.791 pt3 00:14:18.791 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:18.791 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:18.791 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:18.791 
11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:18.791 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:18.791 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:18.792 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.792 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.792 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.792 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.792 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:18.792 11:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.050 11:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.050 "name": "raid_bdev1", 00:14:19.050 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:19.050 "strip_size_kb": 0, 00:14:19.050 "state": "online", 00:14:19.050 "raid_level": "raid1", 00:14:19.050 "superblock": true, 00:14:19.050 "num_base_bdevs": 3, 00:14:19.050 "num_base_bdevs_discovered": 2, 00:14:19.050 "num_base_bdevs_operational": 2, 00:14:19.050 "base_bdevs_list": [ 00:14:19.050 { 00:14:19.050 "name": null, 00:14:19.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.050 "is_configured": false, 00:14:19.050 "data_offset": 2048, 00:14:19.050 "data_size": 63488 00:14:19.050 }, 00:14:19.050 { 00:14:19.050 "name": "pt2", 00:14:19.050 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:19.050 "is_configured": true, 00:14:19.050 "data_offset": 2048, 00:14:19.050 "data_size": 63488 00:14:19.050 }, 00:14:19.050 
{ 00:14:19.050 "name": "pt3", 00:14:19.050 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:19.050 "is_configured": true, 00:14:19.050 "data_offset": 2048, 00:14:19.050 "data_size": 63488 00:14:19.050 } 00:14:19.050 ] 00:14:19.050 }' 00:14:19.050 11:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.050 11:55:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.308 11:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:19.566 [2024-07-12 11:55:09.686885] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:19.566 [2024-07-12 11:55:09.686904] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:19.566 [2024-07-12 11:55:09.686941] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:19.566 [2024-07-12 11:55:09.686977] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:19.566 [2024-07-12 11:55:09.686984] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1862e30 name raid_bdev1, state offline 00:14:19.566 11:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:14:19.566 11:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.824 11:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:14:19.824 11:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:14:19.824 11:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:14:19.824 11:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:14:19.824 11:55:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:19.824 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:20.082 [2024-07-12 11:55:10.204200] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:20.082 [2024-07-12 11:55:10.204234] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:20.082 [2024-07-12 11:55:10.204246] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1863220 00:14:20.082 [2024-07-12 11:55:10.204257] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:20.082 [2024-07-12 11:55:10.205439] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:20.082 [2024-07-12 11:55:10.205459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:20.082 [2024-07-12 11:55:10.205504] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:20.082 [2024-07-12 11:55:10.205533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:20.082 [2024-07-12 11:55:10.205604] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:14:20.082 [2024-07-12 11:55:10.205611] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:20.082 [2024-07-12 11:55:10.205620] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16b53b0 name raid_bdev1, state configuring 00:14:20.082 [2024-07-12 11:55:10.205636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:20.082 pt1 00:14:20.082 11:55:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.082 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:20.340 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.340 "name": "raid_bdev1", 00:14:20.340 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:20.340 "strip_size_kb": 0, 00:14:20.340 "state": "configuring", 00:14:20.340 "raid_level": "raid1", 00:14:20.340 "superblock": true, 00:14:20.340 "num_base_bdevs": 3, 00:14:20.340 "num_base_bdevs_discovered": 1, 00:14:20.340 "num_base_bdevs_operational": 2, 00:14:20.340 
"base_bdevs_list": [ 00:14:20.340 { 00:14:20.340 "name": null, 00:14:20.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.340 "is_configured": false, 00:14:20.340 "data_offset": 2048, 00:14:20.340 "data_size": 63488 00:14:20.340 }, 00:14:20.340 { 00:14:20.340 "name": "pt2", 00:14:20.340 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:20.340 "is_configured": true, 00:14:20.340 "data_offset": 2048, 00:14:20.340 "data_size": 63488 00:14:20.340 }, 00:14:20.340 { 00:14:20.340 "name": null, 00:14:20.340 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:20.340 "is_configured": false, 00:14:20.340 "data_offset": 2048, 00:14:20.340 "data_size": 63488 00:14:20.340 } 00:14:20.340 ] 00:14:20.340 }' 00:14:20.340 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.340 11:55:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.905 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:20.905 11:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:14:20.905 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:14:20.905 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:21.164 [2024-07-12 11:55:11.202891] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:21.164 [2024-07-12 11:55:11.202930] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:21.164 [2024-07-12 11:55:11.202942] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16b5630 00:14:21.164 [2024-07-12 
11:55:11.202965] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:21.164 [2024-07-12 11:55:11.203202] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:21.164 [2024-07-12 11:55:11.203211] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:21.164 [2024-07-12 11:55:11.203255] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:21.164 [2024-07-12 11:55:11.203266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:21.164 [2024-07-12 11:55:11.203335] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16b53b0 00:14:21.164 [2024-07-12 11:55:11.203341] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:21.164 [2024-07-12 11:55:11.203450] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1865da0 00:14:21.164 [2024-07-12 11:55:11.203543] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16b53b0 00:14:21.164 [2024-07-12 11:55:11.203550] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16b53b0 00:14:21.164 [2024-07-12 11:55:11.203618] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:21.164 pt3 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=2 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.164 "name": "raid_bdev1", 00:14:21.164 "uuid": "053e32e1-804d-4ea7-bdd6-497900c26f9e", 00:14:21.164 "strip_size_kb": 0, 00:14:21.164 "state": "online", 00:14:21.164 "raid_level": "raid1", 00:14:21.164 "superblock": true, 00:14:21.164 "num_base_bdevs": 3, 00:14:21.164 "num_base_bdevs_discovered": 2, 00:14:21.164 "num_base_bdevs_operational": 2, 00:14:21.164 "base_bdevs_list": [ 00:14:21.164 { 00:14:21.164 "name": null, 00:14:21.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:21.164 "is_configured": false, 00:14:21.164 "data_offset": 2048, 00:14:21.164 "data_size": 63488 00:14:21.164 }, 00:14:21.164 { 00:14:21.164 "name": "pt2", 00:14:21.164 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:21.164 "is_configured": true, 00:14:21.164 "data_offset": 2048, 00:14:21.164 "data_size": 63488 00:14:21.164 }, 00:14:21.164 { 00:14:21.164 "name": "pt3", 00:14:21.164 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:21.164 "is_configured": true, 00:14:21.164 "data_offset": 2048, 00:14:21.164 "data_size": 63488 00:14:21.164 } 00:14:21.164 ] 00:14:21.164 }' 00:14:21.164 11:55:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.164 11:55:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.730 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:14:21.730 11:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:14:21.987 [2024-07-12 11:55:12.181641] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 053e32e1-804d-4ea7-bdd6-497900c26f9e '!=' 053e32e1-804d-4ea7-bdd6-497900c26f9e ']' 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 644127 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 644127 ']' 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 644127 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 644127 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:21.987 11:55:12 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 644127' 00:14:21.987 killing process with pid 644127 00:14:21.987 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 644127 00:14:21.987 [2024-07-12 11:55:12.227653] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:21.988 [2024-07-12 11:55:12.227693] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:21.988 [2024-07-12 11:55:12.227731] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:21.988 [2024-07-12 11:55:12.227738] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16b53b0 name raid_bdev1, state offline 00:14:21.988 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 644127 00:14:22.246 [2024-07-12 11:55:12.251282] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:22.246 11:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:22.246 00:14:22.246 real 0m16.395s 00:14:22.246 user 0m30.400s 00:14:22.246 sys 0m2.520s 00:14:22.246 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:22.246 11:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.246 ************************************ 00:14:22.246 END TEST raid_superblock_test 00:14:22.246 ************************************ 00:14:22.246 11:55:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:22.246 11:55:12 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:14:22.246 11:55:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:22.246 11:55:12 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:14:22.246 11:55:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:22.246 ************************************ 00:14:22.246 START TEST raid_read_error_test 00:14:22.246 ************************************ 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
local base_bdevs 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:22.246 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ixHX7ls4jv 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=647317 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 647317 /var/tmp/spdk-raid.sock 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 647317 ']' 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:14:22.505 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:22.505 11:55:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.505 [2024-07-12 11:55:12.544162] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:14:22.505 [2024-07-12 11:55:12.544199] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647317 ] 00:14:22.505 [2024-07-12 11:55:12.606450] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:22.505 [2024-07-12 11:55:12.683545] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:22.505 [2024-07-12 11:55:12.738393] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:22.505 [2024-07-12 11:55:12.738419] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:23.440 11:55:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:23.440 11:55:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:23.440 11:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:23.440 11:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:23.440 BaseBdev1_malloc 00:14:23.440 11:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:23.440 true 00:14:23.440 11:55:13 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:23.699 [2024-07-12 11:55:13.806933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:23.699 [2024-07-12 11:55:13.806961] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:23.699 [2024-07-12 11:55:13.806971] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27532d0 00:14:23.699 [2024-07-12 11:55:13.806978] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:23.699 [2024-07-12 11:55:13.808183] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:23.699 [2024-07-12 11:55:13.808203] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:23.699 BaseBdev1 00:14:23.699 11:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:23.699 11:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:23.958 BaseBdev2_malloc 00:14:23.958 11:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:23.958 true 00:14:23.958 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:24.216 [2024-07-12 11:55:14.291769] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:24.216 [2024-07-12 11:55:14.291799] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:14:24.216 [2024-07-12 11:55:14.291810] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2757f40 00:14:24.216 [2024-07-12 11:55:14.291817] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:24.216 [2024-07-12 11:55:14.292829] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:24.216 [2024-07-12 11:55:14.292849] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:24.216 BaseBdev2 00:14:24.216 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:24.216 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:24.475 BaseBdev3_malloc 00:14:24.475 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:24.475 true 00:14:24.475 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:24.734 [2024-07-12 11:55:14.788589] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:24.734 [2024-07-12 11:55:14.788617] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:24.734 [2024-07-12 11:55:14.788627] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x275aea0 00:14:24.734 [2024-07-12 11:55:14.788633] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:24.734 [2024-07-12 11:55:14.789655] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:24.734 [2024-07-12 11:55:14.789674] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:24.734 BaseBdev3 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:24.734 [2024-07-12 11:55:14.940986] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:24.734 [2024-07-12 11:55:14.941861] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:24.734 [2024-07-12 11:55:14.941907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:24.734 [2024-07-12 11:55:14.942042] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2754000 00:14:24.734 [2024-07-12 11:55:14.942049] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:24.734 [2024-07-12 11:55:14.942171] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x275b610 00:14:24.734 [2024-07-12 11:55:14.942270] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2754000 00:14:24.734 [2024-07-12 11:55:14.942275] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2754000 00:14:24.734 [2024-07-12 11:55:14.942339] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:24.734 11:55:14 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.734 11:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:24.993 11:55:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.993 "name": "raid_bdev1", 00:14:24.993 "uuid": "b66157a5-305e-4637-afce-54e92c131623", 00:14:24.993 "strip_size_kb": 0, 00:14:24.993 "state": "online", 00:14:24.993 "raid_level": "raid1", 00:14:24.993 "superblock": true, 00:14:24.993 "num_base_bdevs": 3, 00:14:24.993 "num_base_bdevs_discovered": 3, 00:14:24.993 "num_base_bdevs_operational": 3, 00:14:24.993 "base_bdevs_list": [ 00:14:24.993 { 00:14:24.993 "name": "BaseBdev1", 00:14:24.993 "uuid": "63e9bda0-8507-5bec-96cf-2cd008f64109", 00:14:24.993 "is_configured": true, 00:14:24.993 "data_offset": 2048, 00:14:24.993 "data_size": 63488 00:14:24.993 }, 00:14:24.993 { 00:14:24.993 "name": "BaseBdev2", 00:14:24.993 "uuid": "f9562689-dcfe-52b3-a9eb-9c75bc5da7e5", 00:14:24.993 "is_configured": true, 00:14:24.993 "data_offset": 2048, 00:14:24.993 "data_size": 63488 00:14:24.993 }, 00:14:24.993 { 00:14:24.993 "name": "BaseBdev3", 00:14:24.993 "uuid": "74fa3b44-2ce7-528f-8448-133f13c15a06", 
00:14:24.993 "is_configured": true, 00:14:24.993 "data_offset": 2048, 00:14:24.993 "data_size": 63488 00:14:24.993 } 00:14:24.993 ] 00:14:24.993 }' 00:14:24.993 11:55:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.993 11:55:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.561 11:55:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:25.561 11:55:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:25.561 [2024-07-12 11:55:15.683096] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x275a700 00:14:26.496 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:26.755 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.755 "name": "raid_bdev1", 00:14:26.755 "uuid": "b66157a5-305e-4637-afce-54e92c131623", 00:14:26.756 "strip_size_kb": 0, 00:14:26.756 "state": "online", 00:14:26.756 "raid_level": "raid1", 00:14:26.756 "superblock": true, 00:14:26.756 "num_base_bdevs": 3, 00:14:26.756 "num_base_bdevs_discovered": 3, 00:14:26.756 "num_base_bdevs_operational": 3, 00:14:26.756 "base_bdevs_list": [ 00:14:26.756 { 00:14:26.756 "name": "BaseBdev1", 00:14:26.756 "uuid": "63e9bda0-8507-5bec-96cf-2cd008f64109", 00:14:26.756 "is_configured": true, 00:14:26.756 "data_offset": 2048, 00:14:26.756 "data_size": 63488 00:14:26.756 }, 00:14:26.756 { 00:14:26.756 "name": "BaseBdev2", 00:14:26.756 "uuid": "f9562689-dcfe-52b3-a9eb-9c75bc5da7e5", 00:14:26.756 "is_configured": true, 00:14:26.756 "data_offset": 2048, 00:14:26.756 "data_size": 63488 00:14:26.756 }, 00:14:26.756 { 00:14:26.756 "name": "BaseBdev3", 00:14:26.756 "uuid": "74fa3b44-2ce7-528f-8448-133f13c15a06", 00:14:26.756 "is_configured": true, 00:14:26.756 "data_offset": 2048, 00:14:26.756 
"data_size": 63488 00:14:26.756 } 00:14:26.756 ] 00:14:26.756 }' 00:14:26.756 11:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.756 11:55:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.324 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:27.584 [2024-07-12 11:55:17.608071] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:27.584 [2024-07-12 11:55:17.608099] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:27.584 [2024-07-12 11:55:17.610183] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:27.584 [2024-07-12 11:55:17.610211] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:27.584 [2024-07-12 11:55:17.610274] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:27.584 [2024-07-12 11:55:17.610280] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2754000 name raid_bdev1, state offline 00:14:27.584 0 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 647317 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 647317 ']' 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 647317 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 647317 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 647317' 00:14:27.584 killing process with pid 647317 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 647317 00:14:27.584 [2024-07-12 11:55:17.661546] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:27.584 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 647317 00:14:27.584 [2024-07-12 11:55:17.680393] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:27.843 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ixHX7ls4jv 00:14:27.843 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:27.843 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:27.843 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:27.843 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:27.843 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:27.843 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:27.844 11:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:27.844 00:14:27.844 real 0m5.388s 00:14:27.844 user 0m8.339s 00:14:27.844 sys 0m0.788s 00:14:27.844 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:27.844 11:55:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.844 ************************************ 00:14:27.844 END TEST raid_read_error_test 00:14:27.844 ************************************ 00:14:27.844 
11:55:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:27.844 11:55:17 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:14:27.844 11:55:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:27.844 11:55:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:27.844 11:55:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:27.844 ************************************ 00:14:27.844 START TEST raid_write_error_test 00:14:27.844 ************************************ 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:27.844 11:55:17 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.NlE9GrVinN 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=648323 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 648323 /var/tmp/spdk-raid.sock 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 648323 ']' 00:14:27.844 11:55:17 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:27.844 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:27.844 11:55:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.844 [2024-07-12 11:55:17.999049] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:14:27.844 [2024-07-12 11:55:17.999089] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid648323 ] 00:14:27.844 [2024-07-12 11:55:18.061186] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:28.103 [2024-07-12 11:55:18.132157] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:28.103 [2024-07-12 11:55:18.183547] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:28.103 [2024-07-12 11:55:18.183575] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:28.690 11:55:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:28.690 11:55:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:28.690 11:55:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:28.690 11:55:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:28.690 BaseBdev1_malloc 00:14:28.948 11:55:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:28.948 true 00:14:28.948 11:55:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:29.207 [2024-07-12 11:55:19.243740] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:29.207 [2024-07-12 11:55:19.243774] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:29.207 [2024-07-12 11:55:19.243783] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b712d0 00:14:29.207 [2024-07-12 11:55:19.243789] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:29.207 [2024-07-12 11:55:19.244851] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:29.207 [2024-07-12 11:55:19.244875] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:29.207 BaseBdev1 00:14:29.207 11:55:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:29.208 11:55:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:29.208 BaseBdev2_malloc 00:14:29.208 11:55:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:29.466 true 00:14:29.466 11:55:19 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:29.725 [2024-07-12 11:55:19.724235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:29.725 [2024-07-12 11:55:19.724263] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:29.725 [2024-07-12 11:55:19.724273] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b75f40 00:14:29.725 [2024-07-12 11:55:19.724279] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:29.725 [2024-07-12 11:55:19.725233] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:29.725 [2024-07-12 11:55:19.725254] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:29.725 BaseBdev2 00:14:29.725 11:55:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:29.725 11:55:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:29.725 BaseBdev3_malloc 00:14:29.725 11:55:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:29.984 true 00:14:29.984 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:30.243 [2024-07-12 11:55:20.237298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:30.243 [2024-07-12 11:55:20.237331] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:30.243 [2024-07-12 11:55:20.237342] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b78ea0 00:14:30.243 [2024-07-12 11:55:20.237348] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:30.243 [2024-07-12 11:55:20.238353] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:30.243 [2024-07-12 11:55:20.238375] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:30.243 BaseBdev3 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:30.243 [2024-07-12 11:55:20.405751] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:30.243 [2024-07-12 11:55:20.406566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:30.243 [2024-07-12 11:55:20.406609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:30.243 [2024-07-12 11:55:20.406738] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b72000 00:14:30.243 [2024-07-12 11:55:20.406744] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:30.243 [2024-07-12 11:55:20.406860] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b79610 00:14:30.243 [2024-07-12 11:55:20.406956] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b72000 00:14:30.243 [2024-07-12 11:55:20.406961] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b72000 00:14:30.243 [2024-07-12 11:55:20.407025] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:30.243 11:55:20 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.243 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:30.502 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.502 "name": "raid_bdev1", 00:14:30.502 "uuid": "fa3a34b5-7eb0-42cf-b1c9-646f3911b5b4", 00:14:30.502 "strip_size_kb": 0, 00:14:30.502 "state": "online", 00:14:30.502 "raid_level": "raid1", 00:14:30.502 "superblock": true, 00:14:30.503 "num_base_bdevs": 3, 00:14:30.503 "num_base_bdevs_discovered": 3, 00:14:30.503 "num_base_bdevs_operational": 3, 00:14:30.503 "base_bdevs_list": [ 00:14:30.503 { 00:14:30.503 "name": "BaseBdev1", 00:14:30.503 "uuid": 
"a44c5385-c3a9-5286-9527-31aa80a293ed", 00:14:30.503 "is_configured": true, 00:14:30.503 "data_offset": 2048, 00:14:30.503 "data_size": 63488 00:14:30.503 }, 00:14:30.503 { 00:14:30.503 "name": "BaseBdev2", 00:14:30.503 "uuid": "c32a0e07-5ae7-504c-909b-3ca29164c36a", 00:14:30.503 "is_configured": true, 00:14:30.503 "data_offset": 2048, 00:14:30.503 "data_size": 63488 00:14:30.503 }, 00:14:30.503 { 00:14:30.503 "name": "BaseBdev3", 00:14:30.503 "uuid": "3015eb87-2ab8-54e2-b868-cbdc20740cef", 00:14:30.503 "is_configured": true, 00:14:30.503 "data_offset": 2048, 00:14:30.503 "data_size": 63488 00:14:30.503 } 00:14:30.503 ] 00:14:30.503 }' 00:14:30.503 11:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.503 11:55:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.071 11:55:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:31.071 11:55:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:31.071 [2024-07-12 11:55:21.139838] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b78700 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:32.009 [2024-07-12 11:55:22.215392] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:14:32.009 [2024-07-12 11:55:22.215440] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:32.009 [2024-07-12 11:55:22.215608] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1b78700 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.009 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:32.267 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.267 "name": "raid_bdev1", 00:14:32.267 "uuid": "fa3a34b5-7eb0-42cf-b1c9-646f3911b5b4", 
00:14:32.267 "strip_size_kb": 0, 00:14:32.267 "state": "online", 00:14:32.267 "raid_level": "raid1", 00:14:32.267 "superblock": true, 00:14:32.267 "num_base_bdevs": 3, 00:14:32.267 "num_base_bdevs_discovered": 2, 00:14:32.267 "num_base_bdevs_operational": 2, 00:14:32.267 "base_bdevs_list": [ 00:14:32.267 { 00:14:32.267 "name": null, 00:14:32.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.267 "is_configured": false, 00:14:32.267 "data_offset": 2048, 00:14:32.267 "data_size": 63488 00:14:32.267 }, 00:14:32.267 { 00:14:32.267 "name": "BaseBdev2", 00:14:32.267 "uuid": "c32a0e07-5ae7-504c-909b-3ca29164c36a", 00:14:32.267 "is_configured": true, 00:14:32.267 "data_offset": 2048, 00:14:32.267 "data_size": 63488 00:14:32.267 }, 00:14:32.267 { 00:14:32.267 "name": "BaseBdev3", 00:14:32.267 "uuid": "3015eb87-2ab8-54e2-b868-cbdc20740cef", 00:14:32.267 "is_configured": true, 00:14:32.267 "data_offset": 2048, 00:14:32.267 "data_size": 63488 00:14:32.267 } 00:14:32.267 ] 00:14:32.267 }' 00:14:32.267 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.267 11:55:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.832 11:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:32.832 [2024-07-12 11:55:23.049785] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:32.832 [2024-07-12 11:55:23.049820] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:32.832 [2024-07-12 11:55:23.051769] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:32.832 [2024-07-12 11:55:23.051795] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:32.832 [2024-07-12 11:55:23.051840] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:14:32.832 [2024-07-12 11:55:23.051846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b72000 name raid_bdev1, state offline 00:14:32.832 0 00:14:32.832 11:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 648323 00:14:32.832 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 648323 ']' 00:14:32.832 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 648323 00:14:32.832 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:32.832 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:32.832 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 648323 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 648323' 00:14:33.090 killing process with pid 648323 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 648323 00:14:33.090 [2024-07-12 11:55:23.109075] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 648323 00:14:33.090 [2024-07-12 11:55:23.126319] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.NlE9GrVinN 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 
00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:33.090 00:14:33.090 real 0m5.375s 00:14:33.090 user 0m8.328s 00:14:33.090 sys 0m0.793s 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:33.090 11:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.090 ************************************ 00:14:33.090 END TEST raid_write_error_test 00:14:33.090 ************************************ 00:14:33.348 11:55:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:33.348 11:55:23 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:14:33.348 11:55:23 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:33.348 11:55:23 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:14:33.348 11:55:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:33.348 11:55:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:33.348 11:55:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:33.348 ************************************ 00:14:33.348 START TEST raid_state_function_test 00:14:33.348 ************************************ 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:33.348 11:55:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # 
local base_bdevs 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=649337 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 649337' 00:14:33.348 Process raid pid: 649337 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 649337 /var/tmp/spdk-raid.sock 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 649337 ']' 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:33.348 11:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:33.348 
11:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:33.348 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:33.349 11:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:33.349 11:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.349 [2024-07-12 11:55:23.431656] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:14:33.349 [2024-07-12 11:55:23.431690] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:33.349 [2024-07-12 11:55:23.494910] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:33.349 [2024-07-12 11:55:23.571743] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:33.607 [2024-07-12 11:55:23.623137] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:33.607 [2024-07-12 11:55:23.623181] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:34.174 [2024-07-12 11:55:24.354417] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:34.174 [2024-07-12 11:55:24.354446] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:34.174 [2024-07-12 11:55:24.354452] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:34.174 [2024-07-12 11:55:24.354457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:34.174 [2024-07-12 11:55:24.354462] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:34.174 [2024-07-12 11:55:24.354467] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:34.174 [2024-07-12 11:55:24.354474] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:34.174 [2024-07-12 11:55:24.354479] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.174 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.432 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.432 "name": "Existed_Raid", 00:14:34.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.432 "strip_size_kb": 64, 00:14:34.432 "state": "configuring", 00:14:34.432 "raid_level": "raid0", 00:14:34.432 "superblock": false, 00:14:34.432 "num_base_bdevs": 4, 00:14:34.432 "num_base_bdevs_discovered": 0, 00:14:34.432 "num_base_bdevs_operational": 4, 00:14:34.432 "base_bdevs_list": [ 00:14:34.432 { 00:14:34.432 "name": "BaseBdev1", 00:14:34.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.432 "is_configured": false, 00:14:34.432 "data_offset": 0, 00:14:34.433 "data_size": 0 00:14:34.433 }, 00:14:34.433 { 00:14:34.433 "name": "BaseBdev2", 00:14:34.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.433 "is_configured": false, 00:14:34.433 "data_offset": 0, 00:14:34.433 "data_size": 0 00:14:34.433 }, 00:14:34.433 { 00:14:34.433 "name": "BaseBdev3", 00:14:34.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.433 "is_configured": false, 00:14:34.433 "data_offset": 0, 00:14:34.433 "data_size": 0 00:14:34.433 }, 00:14:34.433 { 00:14:34.433 "name": "BaseBdev4", 00:14:34.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.433 "is_configured": false, 00:14:34.433 "data_offset": 0, 00:14:34.433 "data_size": 0 00:14:34.433 } 00:14:34.433 ] 00:14:34.433 }' 00:14:34.433 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.433 11:55:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 
00:14:35.000 11:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:35.000 [2024-07-12 11:55:25.152381] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:35.000 [2024-07-12 11:55:25.152401] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c21f0 name Existed_Raid, state configuring 00:14:35.000 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:35.258 [2024-07-12 11:55:25.312812] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:35.258 [2024-07-12 11:55:25.312830] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:35.258 [2024-07-12 11:55:25.312835] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:35.258 [2024-07-12 11:55:25.312840] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:35.258 [2024-07-12 11:55:25.312844] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:35.258 [2024-07-12 11:55:25.312869] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:35.258 [2024-07-12 11:55:25.312874] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:35.258 [2024-07-12 11:55:25.312879] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:35.258 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 
-b BaseBdev1 00:14:35.258 [2024-07-12 11:55:25.489385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:35.258 BaseBdev1 00:14:35.258 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:35.258 11:55:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:35.258 11:55:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:35.258 11:55:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:35.258 11:55:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:35.258 11:55:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:35.258 11:55:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:35.516 11:55:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:35.775 [ 00:14:35.775 { 00:14:35.775 "name": "BaseBdev1", 00:14:35.775 "aliases": [ 00:14:35.775 "035a2ce8-f8e5-455c-9b3e-e65531930578" 00:14:35.775 ], 00:14:35.775 "product_name": "Malloc disk", 00:14:35.775 "block_size": 512, 00:14:35.775 "num_blocks": 65536, 00:14:35.775 "uuid": "035a2ce8-f8e5-455c-9b3e-e65531930578", 00:14:35.775 "assigned_rate_limits": { 00:14:35.775 "rw_ios_per_sec": 0, 00:14:35.775 "rw_mbytes_per_sec": 0, 00:14:35.775 "r_mbytes_per_sec": 0, 00:14:35.775 "w_mbytes_per_sec": 0 00:14:35.775 }, 00:14:35.775 "claimed": true, 00:14:35.775 "claim_type": "exclusive_write", 00:14:35.775 "zoned": false, 00:14:35.775 "supported_io_types": { 00:14:35.775 "read": true, 00:14:35.775 "write": true, 00:14:35.775 
"unmap": true, 00:14:35.775 "flush": true, 00:14:35.775 "reset": true, 00:14:35.775 "nvme_admin": false, 00:14:35.775 "nvme_io": false, 00:14:35.775 "nvme_io_md": false, 00:14:35.775 "write_zeroes": true, 00:14:35.775 "zcopy": true, 00:14:35.775 "get_zone_info": false, 00:14:35.775 "zone_management": false, 00:14:35.775 "zone_append": false, 00:14:35.775 "compare": false, 00:14:35.775 "compare_and_write": false, 00:14:35.775 "abort": true, 00:14:35.775 "seek_hole": false, 00:14:35.775 "seek_data": false, 00:14:35.775 "copy": true, 00:14:35.775 "nvme_iov_md": false 00:14:35.775 }, 00:14:35.775 "memory_domains": [ 00:14:35.775 { 00:14:35.775 "dma_device_id": "system", 00:14:35.775 "dma_device_type": 1 00:14:35.775 }, 00:14:35.775 { 00:14:35.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.775 "dma_device_type": 2 00:14:35.775 } 00:14:35.775 ], 00:14:35.775 "driver_specific": {} 00:14:35.775 } 00:14:35.775 ] 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.775 11:55:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.775 11:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.775 "name": "Existed_Raid", 00:14:35.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.775 "strip_size_kb": 64, 00:14:35.775 "state": "configuring", 00:14:35.775 "raid_level": "raid0", 00:14:35.775 "superblock": false, 00:14:35.775 "num_base_bdevs": 4, 00:14:35.775 "num_base_bdevs_discovered": 1, 00:14:35.775 "num_base_bdevs_operational": 4, 00:14:35.775 "base_bdevs_list": [ 00:14:35.775 { 00:14:35.776 "name": "BaseBdev1", 00:14:35.776 "uuid": "035a2ce8-f8e5-455c-9b3e-e65531930578", 00:14:35.776 "is_configured": true, 00:14:35.776 "data_offset": 0, 00:14:35.776 "data_size": 65536 00:14:35.776 }, 00:14:35.776 { 00:14:35.776 "name": "BaseBdev2", 00:14:35.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.776 "is_configured": false, 00:14:35.776 "data_offset": 0, 00:14:35.776 "data_size": 0 00:14:35.776 }, 00:14:35.776 { 00:14:35.776 "name": "BaseBdev3", 00:14:35.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.776 "is_configured": false, 00:14:35.776 "data_offset": 0, 00:14:35.776 "data_size": 0 00:14:35.776 }, 00:14:35.776 { 00:14:35.776 "name": "BaseBdev4", 00:14:35.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.776 "is_configured": false, 00:14:35.776 "data_offset": 0, 00:14:35.776 "data_size": 0 00:14:35.776 } 00:14:35.776 ] 00:14:35.776 }' 00:14:35.776 11:55:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.776 11:55:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:36.342 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:36.623 [2024-07-12 11:55:26.628319] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:36.623 [2024-07-12 11:55:26.628346] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c1a60 name Existed_Raid, state configuring 00:14:36.623 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:36.623 [2024-07-12 11:55:26.792787] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:36.624 [2024-07-12 11:55:26.793955] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:36.624 [2024-07-12 11:55:26.793979] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:36.624 [2024-07-12 11:55:26.793986] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:36.624 [2024-07-12 11:55:26.793992] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:36.624 [2024-07-12 11:55:26.793997] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:36.624 [2024-07-12 11:55:26.794003] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.624 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.897 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.897 "name": "Existed_Raid", 00:14:36.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.897 "strip_size_kb": 64, 00:14:36.897 "state": "configuring", 00:14:36.897 "raid_level": "raid0", 00:14:36.897 "superblock": false, 00:14:36.897 "num_base_bdevs": 4, 00:14:36.897 "num_base_bdevs_discovered": 1, 00:14:36.897 
"num_base_bdevs_operational": 4, 00:14:36.897 "base_bdevs_list": [ 00:14:36.897 { 00:14:36.897 "name": "BaseBdev1", 00:14:36.897 "uuid": "035a2ce8-f8e5-455c-9b3e-e65531930578", 00:14:36.897 "is_configured": true, 00:14:36.897 "data_offset": 0, 00:14:36.897 "data_size": 65536 00:14:36.897 }, 00:14:36.897 { 00:14:36.897 "name": "BaseBdev2", 00:14:36.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.897 "is_configured": false, 00:14:36.897 "data_offset": 0, 00:14:36.897 "data_size": 0 00:14:36.897 }, 00:14:36.897 { 00:14:36.897 "name": "BaseBdev3", 00:14:36.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.897 "is_configured": false, 00:14:36.897 "data_offset": 0, 00:14:36.897 "data_size": 0 00:14:36.897 }, 00:14:36.897 { 00:14:36.897 "name": "BaseBdev4", 00:14:36.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.897 "is_configured": false, 00:14:36.897 "data_offset": 0, 00:14:36.897 "data_size": 0 00:14:36.897 } 00:14:36.897 ] 00:14:36.897 }' 00:14:36.897 11:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.897 11:55:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.463 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:37.463 [2024-07-12 11:55:27.621473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:37.463 BaseBdev2 00:14:37.463 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:37.463 11:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:37.463 11:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:37.463 11:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # 
local i 00:14:37.463 11:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:37.463 11:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:37.463 11:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:37.722 11:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:37.722 [ 00:14:37.722 { 00:14:37.722 "name": "BaseBdev2", 00:14:37.722 "aliases": [ 00:14:37.722 "1fde05c0-33fb-43bf-bc50-13fd0335942a" 00:14:37.722 ], 00:14:37.722 "product_name": "Malloc disk", 00:14:37.722 "block_size": 512, 00:14:37.722 "num_blocks": 65536, 00:14:37.722 "uuid": "1fde05c0-33fb-43bf-bc50-13fd0335942a", 00:14:37.722 "assigned_rate_limits": { 00:14:37.722 "rw_ios_per_sec": 0, 00:14:37.722 "rw_mbytes_per_sec": 0, 00:14:37.722 "r_mbytes_per_sec": 0, 00:14:37.722 "w_mbytes_per_sec": 0 00:14:37.722 }, 00:14:37.722 "claimed": true, 00:14:37.722 "claim_type": "exclusive_write", 00:14:37.722 "zoned": false, 00:14:37.722 "supported_io_types": { 00:14:37.722 "read": true, 00:14:37.722 "write": true, 00:14:37.722 "unmap": true, 00:14:37.722 "flush": true, 00:14:37.722 "reset": true, 00:14:37.722 "nvme_admin": false, 00:14:37.722 "nvme_io": false, 00:14:37.722 "nvme_io_md": false, 00:14:37.722 "write_zeroes": true, 00:14:37.722 "zcopy": true, 00:14:37.722 "get_zone_info": false, 00:14:37.722 "zone_management": false, 00:14:37.722 "zone_append": false, 00:14:37.722 "compare": false, 00:14:37.722 "compare_and_write": false, 00:14:37.722 "abort": true, 00:14:37.722 "seek_hole": false, 00:14:37.722 "seek_data": false, 00:14:37.722 "copy": true, 00:14:37.722 "nvme_iov_md": false 00:14:37.722 }, 00:14:37.722 
"memory_domains": [ 00:14:37.722 { 00:14:37.722 "dma_device_id": "system", 00:14:37.722 "dma_device_type": 1 00:14:37.722 }, 00:14:37.722 { 00:14:37.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.722 "dma_device_type": 2 00:14:37.722 } 00:14:37.722 ], 00:14:37.722 "driver_specific": {} 00:14:37.722 } 00:14:37.722 ] 00:14:37.722 11:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:37.722 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:37.722 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:37.722 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:37.980 11:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.980 11:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.980 "name": "Existed_Raid", 00:14:37.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.980 "strip_size_kb": 64, 00:14:37.980 "state": "configuring", 00:14:37.980 "raid_level": "raid0", 00:14:37.980 "superblock": false, 00:14:37.980 "num_base_bdevs": 4, 00:14:37.980 "num_base_bdevs_discovered": 2, 00:14:37.980 "num_base_bdevs_operational": 4, 00:14:37.980 "base_bdevs_list": [ 00:14:37.980 { 00:14:37.980 "name": "BaseBdev1", 00:14:37.980 "uuid": "035a2ce8-f8e5-455c-9b3e-e65531930578", 00:14:37.980 "is_configured": true, 00:14:37.980 "data_offset": 0, 00:14:37.981 "data_size": 65536 00:14:37.981 }, 00:14:37.981 { 00:14:37.981 "name": "BaseBdev2", 00:14:37.981 "uuid": "1fde05c0-33fb-43bf-bc50-13fd0335942a", 00:14:37.981 "is_configured": true, 00:14:37.981 "data_offset": 0, 00:14:37.981 "data_size": 65536 00:14:37.981 }, 00:14:37.981 { 00:14:37.981 "name": "BaseBdev3", 00:14:37.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.981 "is_configured": false, 00:14:37.981 "data_offset": 0, 00:14:37.981 "data_size": 0 00:14:37.981 }, 00:14:37.981 { 00:14:37.981 "name": "BaseBdev4", 00:14:37.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.981 "is_configured": false, 00:14:37.981 "data_offset": 0, 00:14:37.981 "data_size": 0 00:14:37.981 } 00:14:37.981 ] 00:14:37.981 }' 00:14:37.981 11:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.981 11:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.549 11:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:38.549 
[2024-07-12 11:55:28.795198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:38.808 BaseBdev3 00:14:38.808 11:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:38.808 11:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:38.808 11:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:38.808 11:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:38.808 11:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:38.808 11:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:38.808 11:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:38.808 11:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:39.067 [ 00:14:39.067 { 00:14:39.067 "name": "BaseBdev3", 00:14:39.067 "aliases": [ 00:14:39.067 "7e107f66-539e-4a28-87c7-da8a76b14503" 00:14:39.067 ], 00:14:39.067 "product_name": "Malloc disk", 00:14:39.067 "block_size": 512, 00:14:39.067 "num_blocks": 65536, 00:14:39.067 "uuid": "7e107f66-539e-4a28-87c7-da8a76b14503", 00:14:39.067 "assigned_rate_limits": { 00:14:39.067 "rw_ios_per_sec": 0, 00:14:39.067 "rw_mbytes_per_sec": 0, 00:14:39.067 "r_mbytes_per_sec": 0, 00:14:39.067 "w_mbytes_per_sec": 0 00:14:39.067 }, 00:14:39.067 "claimed": true, 00:14:39.067 "claim_type": "exclusive_write", 00:14:39.067 "zoned": false, 00:14:39.067 "supported_io_types": { 00:14:39.067 "read": true, 00:14:39.067 "write": true, 00:14:39.067 "unmap": true, 00:14:39.067 
"flush": true, 00:14:39.067 "reset": true, 00:14:39.067 "nvme_admin": false, 00:14:39.067 "nvme_io": false, 00:14:39.067 "nvme_io_md": false, 00:14:39.067 "write_zeroes": true, 00:14:39.067 "zcopy": true, 00:14:39.067 "get_zone_info": false, 00:14:39.067 "zone_management": false, 00:14:39.067 "zone_append": false, 00:14:39.067 "compare": false, 00:14:39.067 "compare_and_write": false, 00:14:39.067 "abort": true, 00:14:39.067 "seek_hole": false, 00:14:39.067 "seek_data": false, 00:14:39.067 "copy": true, 00:14:39.067 "nvme_iov_md": false 00:14:39.067 }, 00:14:39.067 "memory_domains": [ 00:14:39.067 { 00:14:39.067 "dma_device_id": "system", 00:14:39.067 "dma_device_type": 1 00:14:39.067 }, 00:14:39.067 { 00:14:39.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.067 "dma_device_type": 2 00:14:39.067 } 00:14:39.067 ], 00:14:39.067 "driver_specific": {} 00:14:39.067 } 00:14:39.067 ] 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.067 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.068 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.068 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.068 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.068 "name": "Existed_Raid", 00:14:39.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.068 "strip_size_kb": 64, 00:14:39.068 "state": "configuring", 00:14:39.068 "raid_level": "raid0", 00:14:39.068 "superblock": false, 00:14:39.068 "num_base_bdevs": 4, 00:14:39.068 "num_base_bdevs_discovered": 3, 00:14:39.068 "num_base_bdevs_operational": 4, 00:14:39.068 "base_bdevs_list": [ 00:14:39.068 { 00:14:39.068 "name": "BaseBdev1", 00:14:39.068 "uuid": "035a2ce8-f8e5-455c-9b3e-e65531930578", 00:14:39.068 "is_configured": true, 00:14:39.068 "data_offset": 0, 00:14:39.068 "data_size": 65536 00:14:39.068 }, 00:14:39.068 { 00:14:39.068 "name": "BaseBdev2", 00:14:39.068 "uuid": "1fde05c0-33fb-43bf-bc50-13fd0335942a", 00:14:39.068 "is_configured": true, 00:14:39.068 "data_offset": 0, 00:14:39.068 "data_size": 65536 00:14:39.068 }, 00:14:39.068 { 00:14:39.068 "name": "BaseBdev3", 00:14:39.068 "uuid": "7e107f66-539e-4a28-87c7-da8a76b14503", 00:14:39.068 "is_configured": true, 00:14:39.068 "data_offset": 0, 00:14:39.068 "data_size": 65536 00:14:39.068 }, 00:14:39.068 { 00:14:39.068 "name": "BaseBdev4", 00:14:39.068 "uuid": "00000000-0000-0000-0000-000000000000", 
00:14:39.068 "is_configured": false, 00:14:39.068 "data_offset": 0, 00:14:39.068 "data_size": 0 00:14:39.068 } 00:14:39.068 ] 00:14:39.068 }' 00:14:39.068 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.068 11:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:39.635 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:39.893 [2024-07-12 11:55:29.968849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:39.893 [2024-07-12 11:55:29.968875] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23c2b90 00:14:39.893 [2024-07-12 11:55:29.968879] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:14:39.893 [2024-07-12 11:55:29.969003] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c2700 00:14:39.893 [2024-07-12 11:55:29.969085] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23c2b90 00:14:39.893 [2024-07-12 11:55:29.969090] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23c2b90 00:14:39.893 [2024-07-12 11:55:29.969197] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:39.893 BaseBdev4 00:14:39.893 11:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:14:39.893 11:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:14:39.894 11:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:39.894 11:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:39.894 11:55:29 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:39.894 11:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:39.894 11:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:40.152 [ 00:14:40.152 { 00:14:40.152 "name": "BaseBdev4", 00:14:40.152 "aliases": [ 00:14:40.152 "124cdad5-cdcf-47c7-bd63-76c82e331575" 00:14:40.152 ], 00:14:40.152 "product_name": "Malloc disk", 00:14:40.152 "block_size": 512, 00:14:40.152 "num_blocks": 65536, 00:14:40.152 "uuid": "124cdad5-cdcf-47c7-bd63-76c82e331575", 00:14:40.152 "assigned_rate_limits": { 00:14:40.152 "rw_ios_per_sec": 0, 00:14:40.152 "rw_mbytes_per_sec": 0, 00:14:40.152 "r_mbytes_per_sec": 0, 00:14:40.152 "w_mbytes_per_sec": 0 00:14:40.152 }, 00:14:40.152 "claimed": true, 00:14:40.152 "claim_type": "exclusive_write", 00:14:40.152 "zoned": false, 00:14:40.152 "supported_io_types": { 00:14:40.152 "read": true, 00:14:40.152 "write": true, 00:14:40.152 "unmap": true, 00:14:40.152 "flush": true, 00:14:40.152 "reset": true, 00:14:40.152 "nvme_admin": false, 00:14:40.152 "nvme_io": false, 00:14:40.152 "nvme_io_md": false, 00:14:40.152 "write_zeroes": true, 00:14:40.152 "zcopy": true, 00:14:40.152 "get_zone_info": false, 00:14:40.152 "zone_management": false, 00:14:40.152 "zone_append": false, 00:14:40.152 "compare": false, 00:14:40.152 "compare_and_write": false, 00:14:40.152 "abort": true, 00:14:40.152 "seek_hole": false, 00:14:40.152 "seek_data": false, 00:14:40.152 "copy": true, 00:14:40.152 "nvme_iov_md": false 00:14:40.152 }, 00:14:40.152 "memory_domains": [ 00:14:40.152 { 00:14:40.152 "dma_device_id": "system", 
00:14:40.152 "dma_device_type": 1 00:14:40.152 }, 00:14:40.152 { 00:14:40.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.152 "dma_device_type": 2 00:14:40.152 } 00:14:40.152 ], 00:14:40.152 "driver_specific": {} 00:14:40.152 } 00:14:40.152 ] 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.152 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.410 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.410 "name": "Existed_Raid", 00:14:40.410 "uuid": "6d09eb0c-eb09-4ecc-b2c7-954d85565544", 00:14:40.410 "strip_size_kb": 64, 00:14:40.410 "state": "online", 00:14:40.410 "raid_level": "raid0", 00:14:40.410 "superblock": false, 00:14:40.410 "num_base_bdevs": 4, 00:14:40.410 "num_base_bdevs_discovered": 4, 00:14:40.410 "num_base_bdevs_operational": 4, 00:14:40.410 "base_bdevs_list": [ 00:14:40.410 { 00:14:40.410 "name": "BaseBdev1", 00:14:40.410 "uuid": "035a2ce8-f8e5-455c-9b3e-e65531930578", 00:14:40.410 "is_configured": true, 00:14:40.410 "data_offset": 0, 00:14:40.410 "data_size": 65536 00:14:40.410 }, 00:14:40.410 { 00:14:40.410 "name": "BaseBdev2", 00:14:40.410 "uuid": "1fde05c0-33fb-43bf-bc50-13fd0335942a", 00:14:40.410 "is_configured": true, 00:14:40.410 "data_offset": 0, 00:14:40.410 "data_size": 65536 00:14:40.410 }, 00:14:40.410 { 00:14:40.410 "name": "BaseBdev3", 00:14:40.410 "uuid": "7e107f66-539e-4a28-87c7-da8a76b14503", 00:14:40.410 "is_configured": true, 00:14:40.410 "data_offset": 0, 00:14:40.410 "data_size": 65536 00:14:40.410 }, 00:14:40.410 { 00:14:40.410 "name": "BaseBdev4", 00:14:40.410 "uuid": "124cdad5-cdcf-47c7-bd63-76c82e331575", 00:14:40.410 "is_configured": true, 00:14:40.410 "data_offset": 0, 00:14:40.410 "data_size": 65536 00:14:40.410 } 00:14:40.410 ] 00:14:40.410 }' 00:14:40.410 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.410 11:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.978 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:40.978 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:40.978 11:55:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:40.978 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:40.978 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:40.978 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:40.978 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:40.978 11:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:40.978 [2024-07-12 11:55:31.091985] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:40.978 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:40.978 "name": "Existed_Raid", 00:14:40.978 "aliases": [ 00:14:40.978 "6d09eb0c-eb09-4ecc-b2c7-954d85565544" 00:14:40.978 ], 00:14:40.978 "product_name": "Raid Volume", 00:14:40.978 "block_size": 512, 00:14:40.978 "num_blocks": 262144, 00:14:40.978 "uuid": "6d09eb0c-eb09-4ecc-b2c7-954d85565544", 00:14:40.978 "assigned_rate_limits": { 00:14:40.978 "rw_ios_per_sec": 0, 00:14:40.978 "rw_mbytes_per_sec": 0, 00:14:40.978 "r_mbytes_per_sec": 0, 00:14:40.978 "w_mbytes_per_sec": 0 00:14:40.978 }, 00:14:40.978 "claimed": false, 00:14:40.978 "zoned": false, 00:14:40.978 "supported_io_types": { 00:14:40.978 "read": true, 00:14:40.978 "write": true, 00:14:40.978 "unmap": true, 00:14:40.978 "flush": true, 00:14:40.978 "reset": true, 00:14:40.978 "nvme_admin": false, 00:14:40.978 "nvme_io": false, 00:14:40.978 "nvme_io_md": false, 00:14:40.978 "write_zeroes": true, 00:14:40.978 "zcopy": false, 00:14:40.978 "get_zone_info": false, 00:14:40.978 "zone_management": false, 00:14:40.978 "zone_append": false, 00:14:40.978 "compare": false, 00:14:40.978 "compare_and_write": false, 00:14:40.978 "abort": 
false, 00:14:40.978 "seek_hole": false, 00:14:40.978 "seek_data": false, 00:14:40.978 "copy": false, 00:14:40.978 "nvme_iov_md": false 00:14:40.978 }, 00:14:40.978 "memory_domains": [ 00:14:40.978 { 00:14:40.978 "dma_device_id": "system", 00:14:40.978 "dma_device_type": 1 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.978 "dma_device_type": 2 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "dma_device_id": "system", 00:14:40.978 "dma_device_type": 1 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.978 "dma_device_type": 2 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "dma_device_id": "system", 00:14:40.978 "dma_device_type": 1 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.978 "dma_device_type": 2 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "dma_device_id": "system", 00:14:40.978 "dma_device_type": 1 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.978 "dma_device_type": 2 00:14:40.978 } 00:14:40.978 ], 00:14:40.978 "driver_specific": { 00:14:40.978 "raid": { 00:14:40.978 "uuid": "6d09eb0c-eb09-4ecc-b2c7-954d85565544", 00:14:40.978 "strip_size_kb": 64, 00:14:40.978 "state": "online", 00:14:40.978 "raid_level": "raid0", 00:14:40.978 "superblock": false, 00:14:40.978 "num_base_bdevs": 4, 00:14:40.978 "num_base_bdevs_discovered": 4, 00:14:40.978 "num_base_bdevs_operational": 4, 00:14:40.978 "base_bdevs_list": [ 00:14:40.978 { 00:14:40.978 "name": "BaseBdev1", 00:14:40.978 "uuid": "035a2ce8-f8e5-455c-9b3e-e65531930578", 00:14:40.978 "is_configured": true, 00:14:40.978 "data_offset": 0, 00:14:40.978 "data_size": 65536 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "name": "BaseBdev2", 00:14:40.978 "uuid": "1fde05c0-33fb-43bf-bc50-13fd0335942a", 00:14:40.978 "is_configured": true, 00:14:40.978 "data_offset": 0, 00:14:40.978 "data_size": 65536 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "name": 
"BaseBdev3", 00:14:40.978 "uuid": "7e107f66-539e-4a28-87c7-da8a76b14503", 00:14:40.978 "is_configured": true, 00:14:40.978 "data_offset": 0, 00:14:40.978 "data_size": 65536 00:14:40.978 }, 00:14:40.978 { 00:14:40.978 "name": "BaseBdev4", 00:14:40.978 "uuid": "124cdad5-cdcf-47c7-bd63-76c82e331575", 00:14:40.978 "is_configured": true, 00:14:40.978 "data_offset": 0, 00:14:40.978 "data_size": 65536 00:14:40.978 } 00:14:40.978 ] 00:14:40.978 } 00:14:40.978 } 00:14:40.978 }' 00:14:40.978 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:40.978 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:40.978 BaseBdev2 00:14:40.978 BaseBdev3 00:14:40.978 BaseBdev4' 00:14:40.978 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.978 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:40.978 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.238 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.238 "name": "BaseBdev1", 00:14:41.238 "aliases": [ 00:14:41.238 "035a2ce8-f8e5-455c-9b3e-e65531930578" 00:14:41.238 ], 00:14:41.238 "product_name": "Malloc disk", 00:14:41.238 "block_size": 512, 00:14:41.238 "num_blocks": 65536, 00:14:41.238 "uuid": "035a2ce8-f8e5-455c-9b3e-e65531930578", 00:14:41.238 "assigned_rate_limits": { 00:14:41.238 "rw_ios_per_sec": 0, 00:14:41.238 "rw_mbytes_per_sec": 0, 00:14:41.238 "r_mbytes_per_sec": 0, 00:14:41.238 "w_mbytes_per_sec": 0 00:14:41.238 }, 00:14:41.238 "claimed": true, 00:14:41.238 "claim_type": "exclusive_write", 00:14:41.238 "zoned": false, 00:14:41.238 "supported_io_types": { 
00:14:41.238 "read": true, 00:14:41.238 "write": true, 00:14:41.238 "unmap": true, 00:14:41.238 "flush": true, 00:14:41.238 "reset": true, 00:14:41.238 "nvme_admin": false, 00:14:41.238 "nvme_io": false, 00:14:41.238 "nvme_io_md": false, 00:14:41.238 "write_zeroes": true, 00:14:41.238 "zcopy": true, 00:14:41.238 "get_zone_info": false, 00:14:41.238 "zone_management": false, 00:14:41.238 "zone_append": false, 00:14:41.238 "compare": false, 00:14:41.238 "compare_and_write": false, 00:14:41.238 "abort": true, 00:14:41.238 "seek_hole": false, 00:14:41.238 "seek_data": false, 00:14:41.238 "copy": true, 00:14:41.238 "nvme_iov_md": false 00:14:41.238 }, 00:14:41.238 "memory_domains": [ 00:14:41.238 { 00:14:41.238 "dma_device_id": "system", 00:14:41.238 "dma_device_type": 1 00:14:41.238 }, 00:14:41.238 { 00:14:41.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.238 "dma_device_type": 2 00:14:41.238 } 00:14:41.238 ], 00:14:41.238 "driver_specific": {} 00:14:41.238 }' 00:14:41.238 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.238 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.238 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.238 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.238 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.238 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.238 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.497 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.497 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.497 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:41.497 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.497 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.497 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.497 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:41.497 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.757 "name": "BaseBdev2", 00:14:41.757 "aliases": [ 00:14:41.757 "1fde05c0-33fb-43bf-bc50-13fd0335942a" 00:14:41.757 ], 00:14:41.757 "product_name": "Malloc disk", 00:14:41.757 "block_size": 512, 00:14:41.757 "num_blocks": 65536, 00:14:41.757 "uuid": "1fde05c0-33fb-43bf-bc50-13fd0335942a", 00:14:41.757 "assigned_rate_limits": { 00:14:41.757 "rw_ios_per_sec": 0, 00:14:41.757 "rw_mbytes_per_sec": 0, 00:14:41.757 "r_mbytes_per_sec": 0, 00:14:41.757 "w_mbytes_per_sec": 0 00:14:41.757 }, 00:14:41.757 "claimed": true, 00:14:41.757 "claim_type": "exclusive_write", 00:14:41.757 "zoned": false, 00:14:41.757 "supported_io_types": { 00:14:41.757 "read": true, 00:14:41.757 "write": true, 00:14:41.757 "unmap": true, 00:14:41.757 "flush": true, 00:14:41.757 "reset": true, 00:14:41.757 "nvme_admin": false, 00:14:41.757 "nvme_io": false, 00:14:41.757 "nvme_io_md": false, 00:14:41.757 "write_zeroes": true, 00:14:41.757 "zcopy": true, 00:14:41.757 "get_zone_info": false, 00:14:41.757 "zone_management": false, 00:14:41.757 "zone_append": false, 00:14:41.757 "compare": false, 00:14:41.757 "compare_and_write": false, 00:14:41.757 "abort": true, 00:14:41.757 "seek_hole": false, 00:14:41.757 "seek_data": false, 00:14:41.757 "copy": true, 00:14:41.757 
"nvme_iov_md": false 00:14:41.757 }, 00:14:41.757 "memory_domains": [ 00:14:41.757 { 00:14:41.757 "dma_device_id": "system", 00:14:41.757 "dma_device_type": 1 00:14:41.757 }, 00:14:41.757 { 00:14:41.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.757 "dma_device_type": 2 00:14:41.757 } 00:14:41.757 ], 00:14:41.757 "driver_specific": {} 00:14:41.757 }' 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.757 11:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.016 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.016 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:42.016 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:42.016 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 
-- # jq '.[]' 00:14:42.016 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:42.016 "name": "BaseBdev3", 00:14:42.016 "aliases": [ 00:14:42.016 "7e107f66-539e-4a28-87c7-da8a76b14503" 00:14:42.016 ], 00:14:42.016 "product_name": "Malloc disk", 00:14:42.016 "block_size": 512, 00:14:42.016 "num_blocks": 65536, 00:14:42.016 "uuid": "7e107f66-539e-4a28-87c7-da8a76b14503", 00:14:42.016 "assigned_rate_limits": { 00:14:42.016 "rw_ios_per_sec": 0, 00:14:42.016 "rw_mbytes_per_sec": 0, 00:14:42.016 "r_mbytes_per_sec": 0, 00:14:42.016 "w_mbytes_per_sec": 0 00:14:42.016 }, 00:14:42.016 "claimed": true, 00:14:42.016 "claim_type": "exclusive_write", 00:14:42.016 "zoned": false, 00:14:42.016 "supported_io_types": { 00:14:42.016 "read": true, 00:14:42.016 "write": true, 00:14:42.016 "unmap": true, 00:14:42.017 "flush": true, 00:14:42.017 "reset": true, 00:14:42.017 "nvme_admin": false, 00:14:42.017 "nvme_io": false, 00:14:42.017 "nvme_io_md": false, 00:14:42.017 "write_zeroes": true, 00:14:42.017 "zcopy": true, 00:14:42.017 "get_zone_info": false, 00:14:42.017 "zone_management": false, 00:14:42.017 "zone_append": false, 00:14:42.017 "compare": false, 00:14:42.017 "compare_and_write": false, 00:14:42.017 "abort": true, 00:14:42.017 "seek_hole": false, 00:14:42.017 "seek_data": false, 00:14:42.017 "copy": true, 00:14:42.017 "nvme_iov_md": false 00:14:42.017 }, 00:14:42.017 "memory_domains": [ 00:14:42.017 { 00:14:42.017 "dma_device_id": "system", 00:14:42.017 "dma_device_type": 1 00:14:42.017 }, 00:14:42.017 { 00:14:42.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.017 "dma_device_type": 2 00:14:42.017 } 00:14:42.017 ], 00:14:42.017 "driver_specific": {} 00:14:42.017 }' 00:14:42.017 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.017 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.276 11:55:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.276 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.276 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.276 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.276 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.276 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.276 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:42.276 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.276 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.534 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.534 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:42.534 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:42.534 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:42.534 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:42.534 "name": "BaseBdev4", 00:14:42.534 "aliases": [ 00:14:42.534 "124cdad5-cdcf-47c7-bd63-76c82e331575" 00:14:42.534 ], 00:14:42.534 "product_name": "Malloc disk", 00:14:42.534 "block_size": 512, 00:14:42.534 "num_blocks": 65536, 00:14:42.534 "uuid": "124cdad5-cdcf-47c7-bd63-76c82e331575", 00:14:42.534 "assigned_rate_limits": { 00:14:42.534 "rw_ios_per_sec": 0, 00:14:42.534 "rw_mbytes_per_sec": 0, 00:14:42.534 "r_mbytes_per_sec": 0, 00:14:42.534 "w_mbytes_per_sec": 0 00:14:42.534 }, 
00:14:42.534 "claimed": true, 00:14:42.534 "claim_type": "exclusive_write", 00:14:42.534 "zoned": false, 00:14:42.534 "supported_io_types": { 00:14:42.534 "read": true, 00:14:42.534 "write": true, 00:14:42.534 "unmap": true, 00:14:42.534 "flush": true, 00:14:42.534 "reset": true, 00:14:42.535 "nvme_admin": false, 00:14:42.535 "nvme_io": false, 00:14:42.535 "nvme_io_md": false, 00:14:42.535 "write_zeroes": true, 00:14:42.535 "zcopy": true, 00:14:42.535 "get_zone_info": false, 00:14:42.535 "zone_management": false, 00:14:42.535 "zone_append": false, 00:14:42.535 "compare": false, 00:14:42.535 "compare_and_write": false, 00:14:42.535 "abort": true, 00:14:42.535 "seek_hole": false, 00:14:42.535 "seek_data": false, 00:14:42.535 "copy": true, 00:14:42.535 "nvme_iov_md": false 00:14:42.535 }, 00:14:42.535 "memory_domains": [ 00:14:42.535 { 00:14:42.535 "dma_device_id": "system", 00:14:42.535 "dma_device_type": 1 00:14:42.535 }, 00:14:42.535 { 00:14:42.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.535 "dma_device_type": 2 00:14:42.535 } 00:14:42.535 ], 00:14:42.535 "driver_specific": {} 00:14:42.535 }' 00:14:42.535 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.535 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.794 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.794 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.794 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.794 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.794 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.794 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.794 11:55:32 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:42.794 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.794 11:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.794 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.794 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:43.052 [2024-07-12 11:55:33.181221] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:43.052 [2024-07-12 11:55:33.181240] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:43.052 [2024-07-12 11:55:33.181272] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:43.052 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:43.052 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:43.052 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.053 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.312 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.312 "name": "Existed_Raid", 00:14:43.312 "uuid": "6d09eb0c-eb09-4ecc-b2c7-954d85565544", 00:14:43.312 "strip_size_kb": 64, 00:14:43.312 "state": "offline", 00:14:43.312 "raid_level": "raid0", 00:14:43.312 "superblock": false, 00:14:43.312 "num_base_bdevs": 4, 00:14:43.312 "num_base_bdevs_discovered": 3, 00:14:43.312 "num_base_bdevs_operational": 3, 00:14:43.312 "base_bdevs_list": [ 00:14:43.312 { 00:14:43.312 "name": null, 00:14:43.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.312 "is_configured": false, 00:14:43.312 "data_offset": 0, 00:14:43.312 "data_size": 65536 00:14:43.312 }, 00:14:43.312 { 00:14:43.312 "name": "BaseBdev2", 00:14:43.312 "uuid": "1fde05c0-33fb-43bf-bc50-13fd0335942a", 00:14:43.312 "is_configured": true, 00:14:43.312 "data_offset": 0, 00:14:43.312 "data_size": 65536 00:14:43.312 }, 00:14:43.312 { 00:14:43.312 "name": "BaseBdev3", 00:14:43.312 "uuid": "7e107f66-539e-4a28-87c7-da8a76b14503", 
00:14:43.312 "is_configured": true, 00:14:43.312 "data_offset": 0, 00:14:43.312 "data_size": 65536 00:14:43.312 }, 00:14:43.312 { 00:14:43.312 "name": "BaseBdev4", 00:14:43.312 "uuid": "124cdad5-cdcf-47c7-bd63-76c82e331575", 00:14:43.312 "is_configured": true, 00:14:43.312 "data_offset": 0, 00:14:43.312 "data_size": 65536 00:14:43.312 } 00:14:43.312 ] 00:14:43.312 }' 00:14:43.312 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.312 11:55:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.880 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:43.880 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.880 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.880 11:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:43.880 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:43.880 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:43.880 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:44.139 [2024-07-12 11:55:34.176606] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:44.139 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:44.139 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:44.139 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.139 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:44.139 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:44.139 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:44.139 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:44.398 [2024-07-12 11:55:34.523266] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:44.398 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:44.398 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:44.398 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.398 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:44.657 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:44.657 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:44.657 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:14:44.657 [2024-07-12 11:55:34.853726] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:14:44.657 [2024-07-12 11:55:34.853755] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c2b90 name Existed_Raid, state offline 00:14:44.657 
11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:44.657 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:44.657 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.657 11:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:44.915 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:44.916 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:44.916 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:14:44.916 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:44.916 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:44.916 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:45.175 BaseBdev2 00:14:45.175 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:45.175 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:45.175 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:45.175 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:45.175 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:45.175 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:45.175 11:55:35 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.175 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:45.434 [ 00:14:45.434 { 00:14:45.434 "name": "BaseBdev2", 00:14:45.434 "aliases": [ 00:14:45.434 "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f" 00:14:45.434 ], 00:14:45.434 "product_name": "Malloc disk", 00:14:45.434 "block_size": 512, 00:14:45.434 "num_blocks": 65536, 00:14:45.434 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:45.434 "assigned_rate_limits": { 00:14:45.434 "rw_ios_per_sec": 0, 00:14:45.434 "rw_mbytes_per_sec": 0, 00:14:45.434 "r_mbytes_per_sec": 0, 00:14:45.434 "w_mbytes_per_sec": 0 00:14:45.434 }, 00:14:45.434 "claimed": false, 00:14:45.434 "zoned": false, 00:14:45.434 "supported_io_types": { 00:14:45.434 "read": true, 00:14:45.434 "write": true, 00:14:45.434 "unmap": true, 00:14:45.434 "flush": true, 00:14:45.434 "reset": true, 00:14:45.434 "nvme_admin": false, 00:14:45.434 "nvme_io": false, 00:14:45.434 "nvme_io_md": false, 00:14:45.434 "write_zeroes": true, 00:14:45.434 "zcopy": true, 00:14:45.434 "get_zone_info": false, 00:14:45.434 "zone_management": false, 00:14:45.434 "zone_append": false, 00:14:45.434 "compare": false, 00:14:45.434 "compare_and_write": false, 00:14:45.434 "abort": true, 00:14:45.434 "seek_hole": false, 00:14:45.434 "seek_data": false, 00:14:45.434 "copy": true, 00:14:45.435 "nvme_iov_md": false 00:14:45.435 }, 00:14:45.435 "memory_domains": [ 00:14:45.435 { 00:14:45.435 "dma_device_id": "system", 00:14:45.435 "dma_device_type": 1 00:14:45.435 }, 00:14:45.435 { 00:14:45.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.435 "dma_device_type": 2 00:14:45.435 } 00:14:45.435 ], 00:14:45.435 "driver_specific": {} 00:14:45.435 } 00:14:45.435 ] 00:14:45.435 
11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:45.435 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:45.435 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:45.435 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:45.694 BaseBdev3 00:14:45.694 11:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:45.694 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:45.694 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:45.694 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:45.694 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:45.694 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:45.694 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.694 11:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:45.954 [ 00:14:45.954 { 00:14:45.954 "name": "BaseBdev3", 00:14:45.954 "aliases": [ 00:14:45.954 "70bb2c7c-5c42-4c22-8102-18ffaba33b91" 00:14:45.954 ], 00:14:45.954 "product_name": "Malloc disk", 00:14:45.954 "block_size": 512, 00:14:45.954 "num_blocks": 65536, 00:14:45.954 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:45.954 
"assigned_rate_limits": { 00:14:45.954 "rw_ios_per_sec": 0, 00:14:45.954 "rw_mbytes_per_sec": 0, 00:14:45.954 "r_mbytes_per_sec": 0, 00:14:45.954 "w_mbytes_per_sec": 0 00:14:45.954 }, 00:14:45.954 "claimed": false, 00:14:45.954 "zoned": false, 00:14:45.954 "supported_io_types": { 00:14:45.954 "read": true, 00:14:45.954 "write": true, 00:14:45.954 "unmap": true, 00:14:45.954 "flush": true, 00:14:45.954 "reset": true, 00:14:45.954 "nvme_admin": false, 00:14:45.954 "nvme_io": false, 00:14:45.954 "nvme_io_md": false, 00:14:45.954 "write_zeroes": true, 00:14:45.954 "zcopy": true, 00:14:45.954 "get_zone_info": false, 00:14:45.954 "zone_management": false, 00:14:45.954 "zone_append": false, 00:14:45.954 "compare": false, 00:14:45.954 "compare_and_write": false, 00:14:45.954 "abort": true, 00:14:45.954 "seek_hole": false, 00:14:45.954 "seek_data": false, 00:14:45.954 "copy": true, 00:14:45.954 "nvme_iov_md": false 00:14:45.954 }, 00:14:45.954 "memory_domains": [ 00:14:45.954 { 00:14:45.954 "dma_device_id": "system", 00:14:45.954 "dma_device_type": 1 00:14:45.954 }, 00:14:45.954 { 00:14:45.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.954 "dma_device_type": 2 00:14:45.954 } 00:14:45.954 ], 00:14:45.954 "driver_specific": {} 00:14:45.954 } 00:14:45.954 ] 00:14:45.954 11:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:45.954 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:45.954 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:45.954 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:45.954 BaseBdev4 00:14:46.213 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:14:46.213 11:55:36 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:14:46.213 11:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:46.213 11:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:46.213 11:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:46.213 11:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:46.213 11:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:46.213 11:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:46.472 [ 00:14:46.472 { 00:14:46.472 "name": "BaseBdev4", 00:14:46.472 "aliases": [ 00:14:46.472 "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2" 00:14:46.472 ], 00:14:46.472 "product_name": "Malloc disk", 00:14:46.472 "block_size": 512, 00:14:46.472 "num_blocks": 65536, 00:14:46.472 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:46.472 "assigned_rate_limits": { 00:14:46.472 "rw_ios_per_sec": 0, 00:14:46.472 "rw_mbytes_per_sec": 0, 00:14:46.472 "r_mbytes_per_sec": 0, 00:14:46.472 "w_mbytes_per_sec": 0 00:14:46.472 }, 00:14:46.472 "claimed": false, 00:14:46.472 "zoned": false, 00:14:46.473 "supported_io_types": { 00:14:46.473 "read": true, 00:14:46.473 "write": true, 00:14:46.473 "unmap": true, 00:14:46.473 "flush": true, 00:14:46.473 "reset": true, 00:14:46.473 "nvme_admin": false, 00:14:46.473 "nvme_io": false, 00:14:46.473 "nvme_io_md": false, 00:14:46.473 "write_zeroes": true, 00:14:46.473 "zcopy": true, 00:14:46.473 "get_zone_info": false, 00:14:46.473 "zone_management": false, 00:14:46.473 "zone_append": false, 00:14:46.473 "compare": false, 
00:14:46.473 "compare_and_write": false, 00:14:46.473 "abort": true, 00:14:46.473 "seek_hole": false, 00:14:46.473 "seek_data": false, 00:14:46.473 "copy": true, 00:14:46.473 "nvme_iov_md": false 00:14:46.473 }, 00:14:46.473 "memory_domains": [ 00:14:46.473 { 00:14:46.473 "dma_device_id": "system", 00:14:46.473 "dma_device_type": 1 00:14:46.473 }, 00:14:46.473 { 00:14:46.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.473 "dma_device_type": 2 00:14:46.473 } 00:14:46.473 ], 00:14:46.473 "driver_specific": {} 00:14:46.473 } 00:14:46.473 ] 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:46.473 [2024-07-12 11:55:36.678996] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:46.473 [2024-07-12 11:55:36.679024] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:46.473 [2024-07-12 11:55:36.679036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:46.473 [2024-07-12 11:55:36.679990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:46.473 [2024-07-12 11:55:36.680017] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=Existed_Raid 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.473 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.733 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.733 "name": "Existed_Raid", 00:14:46.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.733 "strip_size_kb": 64, 00:14:46.733 "state": "configuring", 00:14:46.733 "raid_level": "raid0", 00:14:46.733 "superblock": false, 00:14:46.733 "num_base_bdevs": 4, 00:14:46.733 "num_base_bdevs_discovered": 3, 00:14:46.733 "num_base_bdevs_operational": 4, 00:14:46.733 "base_bdevs_list": [ 00:14:46.733 { 00:14:46.733 "name": "BaseBdev1", 00:14:46.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.733 "is_configured": false, 00:14:46.733 "data_offset": 0, 00:14:46.733 "data_size": 0 00:14:46.733 }, 
00:14:46.733 { 00:14:46.733 "name": "BaseBdev2", 00:14:46.733 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:46.733 "is_configured": true, 00:14:46.733 "data_offset": 0, 00:14:46.733 "data_size": 65536 00:14:46.733 }, 00:14:46.733 { 00:14:46.733 "name": "BaseBdev3", 00:14:46.733 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:46.733 "is_configured": true, 00:14:46.733 "data_offset": 0, 00:14:46.733 "data_size": 65536 00:14:46.733 }, 00:14:46.733 { 00:14:46.733 "name": "BaseBdev4", 00:14:46.733 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:46.733 "is_configured": true, 00:14:46.733 "data_offset": 0, 00:14:46.733 "data_size": 65536 00:14:46.733 } 00:14:46.733 ] 00:14:46.733 }' 00:14:46.733 11:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.733 11:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:47.329 [2024-07-12 11:55:37.493105] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:47.329 11:55:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:47.329 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.588 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.588 "name": "Existed_Raid", 00:14:47.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.588 "strip_size_kb": 64, 00:14:47.588 "state": "configuring", 00:14:47.588 "raid_level": "raid0", 00:14:47.588 "superblock": false, 00:14:47.588 "num_base_bdevs": 4, 00:14:47.588 "num_base_bdevs_discovered": 2, 00:14:47.588 "num_base_bdevs_operational": 4, 00:14:47.588 "base_bdevs_list": [ 00:14:47.588 { 00:14:47.588 "name": "BaseBdev1", 00:14:47.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.588 "is_configured": false, 00:14:47.588 "data_offset": 0, 00:14:47.588 "data_size": 0 00:14:47.588 }, 00:14:47.588 { 00:14:47.588 "name": null, 00:14:47.588 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:47.588 "is_configured": false, 00:14:47.588 "data_offset": 0, 00:14:47.588 "data_size": 65536 00:14:47.588 }, 00:14:47.588 { 00:14:47.588 "name": "BaseBdev3", 00:14:47.588 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:47.588 "is_configured": true, 00:14:47.588 "data_offset": 0, 00:14:47.588 "data_size": 65536 00:14:47.588 }, 00:14:47.588 { 00:14:47.588 "name": "BaseBdev4", 00:14:47.588 "uuid": 
"25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:47.588 "is_configured": true, 00:14:47.588 "data_offset": 0, 00:14:47.588 "data_size": 65536 00:14:47.588 } 00:14:47.588 ] 00:14:47.588 }' 00:14:47.588 11:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.588 11:55:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.157 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:48.158 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.158 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:48.158 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:48.417 [2024-07-12 11:55:38.474287] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:48.417 BaseBdev1 00:14:48.417 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:48.417 11:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:48.417 11:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:48.417 11:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:48.417 11:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:48.417 11:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:48.417 11:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:48.417 11:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:48.677 [ 00:14:48.677 { 00:14:48.677 "name": "BaseBdev1", 00:14:48.677 "aliases": [ 00:14:48.677 "0857a3e9-abb3-4159-90b4-a11ab10df23e" 00:14:48.677 ], 00:14:48.677 "product_name": "Malloc disk", 00:14:48.677 "block_size": 512, 00:14:48.677 "num_blocks": 65536, 00:14:48.677 "uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:48.677 "assigned_rate_limits": { 00:14:48.677 "rw_ios_per_sec": 0, 00:14:48.677 "rw_mbytes_per_sec": 0, 00:14:48.677 "r_mbytes_per_sec": 0, 00:14:48.677 "w_mbytes_per_sec": 0 00:14:48.677 }, 00:14:48.677 "claimed": true, 00:14:48.677 "claim_type": "exclusive_write", 00:14:48.677 "zoned": false, 00:14:48.677 "supported_io_types": { 00:14:48.677 "read": true, 00:14:48.677 "write": true, 00:14:48.677 "unmap": true, 00:14:48.677 "flush": true, 00:14:48.677 "reset": true, 00:14:48.677 "nvme_admin": false, 00:14:48.677 "nvme_io": false, 00:14:48.677 "nvme_io_md": false, 00:14:48.677 "write_zeroes": true, 00:14:48.677 "zcopy": true, 00:14:48.677 "get_zone_info": false, 00:14:48.677 "zone_management": false, 00:14:48.677 "zone_append": false, 00:14:48.677 "compare": false, 00:14:48.677 "compare_and_write": false, 00:14:48.677 "abort": true, 00:14:48.677 "seek_hole": false, 00:14:48.677 "seek_data": false, 00:14:48.677 "copy": true, 00:14:48.677 "nvme_iov_md": false 00:14:48.677 }, 00:14:48.677 "memory_domains": [ 00:14:48.677 { 00:14:48.677 "dma_device_id": "system", 00:14:48.677 "dma_device_type": 1 00:14:48.677 }, 00:14:48.677 { 00:14:48.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.677 "dma_device_type": 2 00:14:48.677 } 00:14:48.677 ], 00:14:48.677 "driver_specific": {} 00:14:48.677 } 00:14:48.677 ] 
00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.677 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:48.936 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:48.936 "name": "Existed_Raid", 00:14:48.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:48.936 "strip_size_kb": 64, 00:14:48.936 "state": "configuring", 00:14:48.936 "raid_level": "raid0", 00:14:48.936 "superblock": false, 00:14:48.936 "num_base_bdevs": 4, 00:14:48.936 
"num_base_bdevs_discovered": 3, 00:14:48.936 "num_base_bdevs_operational": 4, 00:14:48.936 "base_bdevs_list": [ 00:14:48.936 { 00:14:48.936 "name": "BaseBdev1", 00:14:48.936 "uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:48.936 "is_configured": true, 00:14:48.936 "data_offset": 0, 00:14:48.936 "data_size": 65536 00:14:48.936 }, 00:14:48.936 { 00:14:48.936 "name": null, 00:14:48.936 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:48.936 "is_configured": false, 00:14:48.936 "data_offset": 0, 00:14:48.936 "data_size": 65536 00:14:48.936 }, 00:14:48.936 { 00:14:48.936 "name": "BaseBdev3", 00:14:48.936 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:48.936 "is_configured": true, 00:14:48.936 "data_offset": 0, 00:14:48.936 "data_size": 65536 00:14:48.936 }, 00:14:48.936 { 00:14:48.936 "name": "BaseBdev4", 00:14:48.936 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:48.936 "is_configured": true, 00:14:48.936 "data_offset": 0, 00:14:48.936 "data_size": 65536 00:14:48.936 } 00:14:48.936 ] 00:14:48.936 }' 00:14:48.937 11:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:48.937 11:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:49.195 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.195 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:49.454 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:49.454 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:49.454 [2024-07-12 11:55:39.693459] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: 
BaseBdev3 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.714 "name": "Existed_Raid", 00:14:49.714 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.714 "strip_size_kb": 64, 00:14:49.714 "state": "configuring", 00:14:49.714 "raid_level": "raid0", 00:14:49.714 "superblock": false, 00:14:49.714 "num_base_bdevs": 4, 00:14:49.714 "num_base_bdevs_discovered": 2, 00:14:49.714 "num_base_bdevs_operational": 4, 00:14:49.714 
"base_bdevs_list": [ 00:14:49.714 { 00:14:49.714 "name": "BaseBdev1", 00:14:49.714 "uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:49.714 "is_configured": true, 00:14:49.714 "data_offset": 0, 00:14:49.714 "data_size": 65536 00:14:49.714 }, 00:14:49.714 { 00:14:49.714 "name": null, 00:14:49.714 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:49.714 "is_configured": false, 00:14:49.714 "data_offset": 0, 00:14:49.714 "data_size": 65536 00:14:49.714 }, 00:14:49.714 { 00:14:49.714 "name": null, 00:14:49.714 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:49.714 "is_configured": false, 00:14:49.714 "data_offset": 0, 00:14:49.714 "data_size": 65536 00:14:49.714 }, 00:14:49.714 { 00:14:49.714 "name": "BaseBdev4", 00:14:49.714 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:49.714 "is_configured": true, 00:14:49.714 "data_offset": 0, 00:14:49.714 "data_size": 65536 00:14:49.714 } 00:14:49.714 ] 00:14:49.714 }' 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.714 11:55:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.282 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.282 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:50.282 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:50.282 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:50.541 [2024-07-12 11:55:40.667997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.541 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.800 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.800 "name": "Existed_Raid", 00:14:50.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.800 "strip_size_kb": 64, 00:14:50.800 "state": "configuring", 00:14:50.800 "raid_level": "raid0", 00:14:50.800 "superblock": false, 00:14:50.800 "num_base_bdevs": 4, 00:14:50.800 "num_base_bdevs_discovered": 3, 00:14:50.800 "num_base_bdevs_operational": 4, 00:14:50.800 "base_bdevs_list": [ 00:14:50.800 { 00:14:50.800 "name": "BaseBdev1", 00:14:50.800 
"uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:50.800 "is_configured": true, 00:14:50.800 "data_offset": 0, 00:14:50.800 "data_size": 65536 00:14:50.800 }, 00:14:50.800 { 00:14:50.800 "name": null, 00:14:50.800 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:50.800 "is_configured": false, 00:14:50.800 "data_offset": 0, 00:14:50.800 "data_size": 65536 00:14:50.800 }, 00:14:50.800 { 00:14:50.800 "name": "BaseBdev3", 00:14:50.800 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:50.800 "is_configured": true, 00:14:50.800 "data_offset": 0, 00:14:50.800 "data_size": 65536 00:14:50.800 }, 00:14:50.800 { 00:14:50.800 "name": "BaseBdev4", 00:14:50.800 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:50.800 "is_configured": true, 00:14:50.800 "data_offset": 0, 00:14:50.800 "data_size": 65536 00:14:50.800 } 00:14:50.800 ] 00:14:50.800 }' 00:14:50.800 11:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.800 11:55:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.059 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.059 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:51.318 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:51.318 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:51.577 [2024-07-12 11:55:41.602414] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:51.577 11:55:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.577 "name": "Existed_Raid", 00:14:51.577 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.577 "strip_size_kb": 64, 00:14:51.577 "state": "configuring", 00:14:51.577 "raid_level": "raid0", 00:14:51.577 "superblock": false, 00:14:51.577 "num_base_bdevs": 4, 00:14:51.577 "num_base_bdevs_discovered": 2, 00:14:51.577 "num_base_bdevs_operational": 4, 00:14:51.577 "base_bdevs_list": [ 00:14:51.577 { 00:14:51.577 "name": null, 00:14:51.577 "uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:51.577 "is_configured": false, 00:14:51.577 "data_offset": 0, 
00:14:51.577 "data_size": 65536 00:14:51.577 }, 00:14:51.577 { 00:14:51.577 "name": null, 00:14:51.577 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:51.577 "is_configured": false, 00:14:51.577 "data_offset": 0, 00:14:51.577 "data_size": 65536 00:14:51.577 }, 00:14:51.577 { 00:14:51.577 "name": "BaseBdev3", 00:14:51.577 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:51.577 "is_configured": true, 00:14:51.577 "data_offset": 0, 00:14:51.577 "data_size": 65536 00:14:51.577 }, 00:14:51.577 { 00:14:51.577 "name": "BaseBdev4", 00:14:51.577 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:51.577 "is_configured": true, 00:14:51.577 "data_offset": 0, 00:14:51.577 "data_size": 65536 00:14:51.577 } 00:14:51.577 ] 00:14:51.577 }' 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.577 11:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.145 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.145 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:52.404 [2024-07-12 11:55:42.598773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.404 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.405 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.405 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.405 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.663 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.663 "name": "Existed_Raid", 00:14:52.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.663 "strip_size_kb": 64, 00:14:52.663 "state": "configuring", 00:14:52.663 "raid_level": "raid0", 00:14:52.663 "superblock": false, 00:14:52.663 "num_base_bdevs": 4, 00:14:52.663 "num_base_bdevs_discovered": 3, 00:14:52.664 "num_base_bdevs_operational": 4, 00:14:52.664 "base_bdevs_list": [ 00:14:52.664 { 00:14:52.664 "name": null, 00:14:52.664 "uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:52.664 "is_configured": false, 00:14:52.664 "data_offset": 0, 00:14:52.664 "data_size": 65536 00:14:52.664 }, 00:14:52.664 { 
00:14:52.664 "name": "BaseBdev2", 00:14:52.664 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:52.664 "is_configured": true, 00:14:52.664 "data_offset": 0, 00:14:52.664 "data_size": 65536 00:14:52.664 }, 00:14:52.664 { 00:14:52.664 "name": "BaseBdev3", 00:14:52.664 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:52.664 "is_configured": true, 00:14:52.664 "data_offset": 0, 00:14:52.664 "data_size": 65536 00:14:52.664 }, 00:14:52.664 { 00:14:52.664 "name": "BaseBdev4", 00:14:52.664 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:52.664 "is_configured": true, 00:14:52.664 "data_offset": 0, 00:14:52.664 "data_size": 65536 00:14:52.664 } 00:14:52.664 ] 00:14:52.664 }' 00:14:52.664 11:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.664 11:55:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.229 11:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.229 11:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:53.229 11:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:53.229 11:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.229 11:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:53.488 11:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 0857a3e9-abb3-4159-90b4-a11ab10df23e 00:14:53.488 [2024-07-12 11:55:43.724371] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:53.488 [2024-07-12 11:55:43.724398] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25676c0 00:14:53.488 [2024-07-12 11:55:43.724402] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:14:53.488 [2024-07-12 11:55:43.724551] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2566c10 00:14:53.488 [2024-07-12 11:55:43.724638] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25676c0 00:14:53.488 [2024-07-12 11:55:43.724644] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25676c0 00:14:53.488 [2024-07-12 11:55:43.724796] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:53.488 NewBaseBdev 00:14:53.746 11:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:53.746 11:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:53.746 11:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:53.746 11:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:53.746 11:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:53.746 11:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:53.746 11:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:53.746 11:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:54.005 [ 00:14:54.005 
{ 00:14:54.005 "name": "NewBaseBdev", 00:14:54.005 "aliases": [ 00:14:54.005 "0857a3e9-abb3-4159-90b4-a11ab10df23e" 00:14:54.005 ], 00:14:54.005 "product_name": "Malloc disk", 00:14:54.005 "block_size": 512, 00:14:54.005 "num_blocks": 65536, 00:14:54.005 "uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:54.005 "assigned_rate_limits": { 00:14:54.005 "rw_ios_per_sec": 0, 00:14:54.005 "rw_mbytes_per_sec": 0, 00:14:54.005 "r_mbytes_per_sec": 0, 00:14:54.005 "w_mbytes_per_sec": 0 00:14:54.005 }, 00:14:54.005 "claimed": true, 00:14:54.005 "claim_type": "exclusive_write", 00:14:54.005 "zoned": false, 00:14:54.005 "supported_io_types": { 00:14:54.005 "read": true, 00:14:54.005 "write": true, 00:14:54.005 "unmap": true, 00:14:54.005 "flush": true, 00:14:54.005 "reset": true, 00:14:54.005 "nvme_admin": false, 00:14:54.005 "nvme_io": false, 00:14:54.005 "nvme_io_md": false, 00:14:54.005 "write_zeroes": true, 00:14:54.005 "zcopy": true, 00:14:54.005 "get_zone_info": false, 00:14:54.005 "zone_management": false, 00:14:54.005 "zone_append": false, 00:14:54.005 "compare": false, 00:14:54.005 "compare_and_write": false, 00:14:54.005 "abort": true, 00:14:54.005 "seek_hole": false, 00:14:54.005 "seek_data": false, 00:14:54.005 "copy": true, 00:14:54.005 "nvme_iov_md": false 00:14:54.005 }, 00:14:54.005 "memory_domains": [ 00:14:54.005 { 00:14:54.005 "dma_device_id": "system", 00:14:54.005 "dma_device_type": 1 00:14:54.005 }, 00:14:54.005 { 00:14:54.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.005 "dma_device_type": 2 00:14:54.005 } 00:14:54.005 ], 00:14:54.005 "driver_specific": {} 00:14:54.005 } 00:14:54.005 ] 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.005 "name": "Existed_Raid", 00:14:54.005 "uuid": "e2c41c63-fa6d-4538-994a-82191b0e5e17", 00:14:54.005 "strip_size_kb": 64, 00:14:54.005 "state": "online", 00:14:54.005 "raid_level": "raid0", 00:14:54.005 "superblock": false, 00:14:54.005 "num_base_bdevs": 4, 00:14:54.005 "num_base_bdevs_discovered": 4, 00:14:54.005 "num_base_bdevs_operational": 4, 00:14:54.005 "base_bdevs_list": [ 00:14:54.005 { 00:14:54.005 "name": "NewBaseBdev", 00:14:54.005 "uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:54.005 "is_configured": true, 00:14:54.005 "data_offset": 0, 00:14:54.005 "data_size": 65536 00:14:54.005 }, 00:14:54.005 { 
00:14:54.005 "name": "BaseBdev2", 00:14:54.005 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:54.005 "is_configured": true, 00:14:54.005 "data_offset": 0, 00:14:54.005 "data_size": 65536 00:14:54.005 }, 00:14:54.005 { 00:14:54.005 "name": "BaseBdev3", 00:14:54.005 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:54.005 "is_configured": true, 00:14:54.005 "data_offset": 0, 00:14:54.005 "data_size": 65536 00:14:54.005 }, 00:14:54.005 { 00:14:54.005 "name": "BaseBdev4", 00:14:54.005 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:54.005 "is_configured": true, 00:14:54.005 "data_offset": 0, 00:14:54.005 "data_size": 65536 00:14:54.005 } 00:14:54.005 ] 00:14:54.005 }' 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.005 11:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.573 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:54.573 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:54.573 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:54.573 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:54.573 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:54.573 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:54.573 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:54.573 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:54.832 [2024-07-12 11:55:44.835474] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 
00:14:54.832 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:54.832 "name": "Existed_Raid", 00:14:54.832 "aliases": [ 00:14:54.832 "e2c41c63-fa6d-4538-994a-82191b0e5e17" 00:14:54.832 ], 00:14:54.832 "product_name": "Raid Volume", 00:14:54.832 "block_size": 512, 00:14:54.832 "num_blocks": 262144, 00:14:54.832 "uuid": "e2c41c63-fa6d-4538-994a-82191b0e5e17", 00:14:54.832 "assigned_rate_limits": { 00:14:54.832 "rw_ios_per_sec": 0, 00:14:54.832 "rw_mbytes_per_sec": 0, 00:14:54.832 "r_mbytes_per_sec": 0, 00:14:54.832 "w_mbytes_per_sec": 0 00:14:54.832 }, 00:14:54.832 "claimed": false, 00:14:54.832 "zoned": false, 00:14:54.832 "supported_io_types": { 00:14:54.832 "read": true, 00:14:54.832 "write": true, 00:14:54.832 "unmap": true, 00:14:54.832 "flush": true, 00:14:54.832 "reset": true, 00:14:54.832 "nvme_admin": false, 00:14:54.832 "nvme_io": false, 00:14:54.832 "nvme_io_md": false, 00:14:54.832 "write_zeroes": true, 00:14:54.832 "zcopy": false, 00:14:54.832 "get_zone_info": false, 00:14:54.832 "zone_management": false, 00:14:54.832 "zone_append": false, 00:14:54.832 "compare": false, 00:14:54.832 "compare_and_write": false, 00:14:54.832 "abort": false, 00:14:54.832 "seek_hole": false, 00:14:54.832 "seek_data": false, 00:14:54.832 "copy": false, 00:14:54.832 "nvme_iov_md": false 00:14:54.832 }, 00:14:54.832 "memory_domains": [ 00:14:54.832 { 00:14:54.832 "dma_device_id": "system", 00:14:54.832 "dma_device_type": 1 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.832 "dma_device_type": 2 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "dma_device_id": "system", 00:14:54.832 "dma_device_type": 1 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.832 "dma_device_type": 2 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "dma_device_id": "system", 00:14:54.832 "dma_device_type": 1 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:54.832 "dma_device_type": 2 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "dma_device_id": "system", 00:14:54.832 "dma_device_type": 1 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.832 "dma_device_type": 2 00:14:54.832 } 00:14:54.832 ], 00:14:54.832 "driver_specific": { 00:14:54.832 "raid": { 00:14:54.832 "uuid": "e2c41c63-fa6d-4538-994a-82191b0e5e17", 00:14:54.832 "strip_size_kb": 64, 00:14:54.832 "state": "online", 00:14:54.832 "raid_level": "raid0", 00:14:54.832 "superblock": false, 00:14:54.832 "num_base_bdevs": 4, 00:14:54.832 "num_base_bdevs_discovered": 4, 00:14:54.832 "num_base_bdevs_operational": 4, 00:14:54.832 "base_bdevs_list": [ 00:14:54.832 { 00:14:54.832 "name": "NewBaseBdev", 00:14:54.832 "uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:54.832 "is_configured": true, 00:14:54.832 "data_offset": 0, 00:14:54.832 "data_size": 65536 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "name": "BaseBdev2", 00:14:54.832 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:54.832 "is_configured": true, 00:14:54.832 "data_offset": 0, 00:14:54.832 "data_size": 65536 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "name": "BaseBdev3", 00:14:54.832 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:54.832 "is_configured": true, 00:14:54.832 "data_offset": 0, 00:14:54.832 "data_size": 65536 00:14:54.832 }, 00:14:54.832 { 00:14:54.832 "name": "BaseBdev4", 00:14:54.832 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:54.832 "is_configured": true, 00:14:54.832 "data_offset": 0, 00:14:54.832 "data_size": 65536 00:14:54.832 } 00:14:54.832 ] 00:14:54.832 } 00:14:54.832 } 00:14:54.832 }' 00:14:54.832 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:54.832 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:54.832 
BaseBdev2 00:14:54.832 BaseBdev3 00:14:54.832 BaseBdev4' 00:14:54.832 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:54.832 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:54.832 11:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:54.832 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:54.832 "name": "NewBaseBdev", 00:14:54.832 "aliases": [ 00:14:54.832 "0857a3e9-abb3-4159-90b4-a11ab10df23e" 00:14:54.832 ], 00:14:54.832 "product_name": "Malloc disk", 00:14:54.832 "block_size": 512, 00:14:54.832 "num_blocks": 65536, 00:14:54.832 "uuid": "0857a3e9-abb3-4159-90b4-a11ab10df23e", 00:14:54.832 "assigned_rate_limits": { 00:14:54.832 "rw_ios_per_sec": 0, 00:14:54.832 "rw_mbytes_per_sec": 0, 00:14:54.832 "r_mbytes_per_sec": 0, 00:14:54.833 "w_mbytes_per_sec": 0 00:14:54.833 }, 00:14:54.833 "claimed": true, 00:14:54.833 "claim_type": "exclusive_write", 00:14:54.833 "zoned": false, 00:14:54.833 "supported_io_types": { 00:14:54.833 "read": true, 00:14:54.833 "write": true, 00:14:54.833 "unmap": true, 00:14:54.833 "flush": true, 00:14:54.833 "reset": true, 00:14:54.833 "nvme_admin": false, 00:14:54.833 "nvme_io": false, 00:14:54.833 "nvme_io_md": false, 00:14:54.833 "write_zeroes": true, 00:14:54.833 "zcopy": true, 00:14:54.833 "get_zone_info": false, 00:14:54.833 "zone_management": false, 00:14:54.833 "zone_append": false, 00:14:54.833 "compare": false, 00:14:54.833 "compare_and_write": false, 00:14:54.833 "abort": true, 00:14:54.833 "seek_hole": false, 00:14:54.833 "seek_data": false, 00:14:54.833 "copy": true, 00:14:54.833 "nvme_iov_md": false 00:14:54.833 }, 00:14:54.833 "memory_domains": [ 00:14:54.833 { 00:14:54.833 "dma_device_id": "system", 00:14:54.833 "dma_device_type": 1 
00:14:54.833 }, 00:14:54.833 { 00:14:54.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.833 "dma_device_type": 2 00:14:54.833 } 00:14:54.833 ], 00:14:54.833 "driver_specific": {} 00:14:54.833 }' 00:14:54.833 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:55.092 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:55.351 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:55.351 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:55.351 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:55.351 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:55.351 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:55.351 "name": "BaseBdev2", 
00:14:55.351 "aliases": [ 00:14:55.351 "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f" 00:14:55.351 ], 00:14:55.351 "product_name": "Malloc disk", 00:14:55.351 "block_size": 512, 00:14:55.351 "num_blocks": 65536, 00:14:55.351 "uuid": "a3e5b8c7-a4c3-4272-ab39-ddb44a1efd8f", 00:14:55.351 "assigned_rate_limits": { 00:14:55.351 "rw_ios_per_sec": 0, 00:14:55.351 "rw_mbytes_per_sec": 0, 00:14:55.351 "r_mbytes_per_sec": 0, 00:14:55.351 "w_mbytes_per_sec": 0 00:14:55.351 }, 00:14:55.351 "claimed": true, 00:14:55.351 "claim_type": "exclusive_write", 00:14:55.351 "zoned": false, 00:14:55.352 "supported_io_types": { 00:14:55.352 "read": true, 00:14:55.352 "write": true, 00:14:55.352 "unmap": true, 00:14:55.352 "flush": true, 00:14:55.352 "reset": true, 00:14:55.352 "nvme_admin": false, 00:14:55.352 "nvme_io": false, 00:14:55.352 "nvme_io_md": false, 00:14:55.352 "write_zeroes": true, 00:14:55.352 "zcopy": true, 00:14:55.352 "get_zone_info": false, 00:14:55.352 "zone_management": false, 00:14:55.352 "zone_append": false, 00:14:55.352 "compare": false, 00:14:55.352 "compare_and_write": false, 00:14:55.352 "abort": true, 00:14:55.352 "seek_hole": false, 00:14:55.352 "seek_data": false, 00:14:55.352 "copy": true, 00:14:55.352 "nvme_iov_md": false 00:14:55.352 }, 00:14:55.352 "memory_domains": [ 00:14:55.352 { 00:14:55.352 "dma_device_id": "system", 00:14:55.352 "dma_device_type": 1 00:14:55.352 }, 00:14:55.352 { 00:14:55.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.352 "dma_device_type": 2 00:14:55.352 } 00:14:55.352 ], 00:14:55.352 "driver_specific": {} 00:14:55.352 }' 00:14:55.352 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.352 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.352 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:55.352 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:55.610 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:55.867 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:55.867 "name": "BaseBdev3", 00:14:55.867 "aliases": [ 00:14:55.867 "70bb2c7c-5c42-4c22-8102-18ffaba33b91" 00:14:55.867 ], 00:14:55.867 "product_name": "Malloc disk", 00:14:55.867 "block_size": 512, 00:14:55.867 "num_blocks": 65536, 00:14:55.867 "uuid": "70bb2c7c-5c42-4c22-8102-18ffaba33b91", 00:14:55.867 "assigned_rate_limits": { 00:14:55.867 "rw_ios_per_sec": 0, 00:14:55.867 "rw_mbytes_per_sec": 0, 00:14:55.867 "r_mbytes_per_sec": 0, 00:14:55.867 "w_mbytes_per_sec": 0 00:14:55.867 }, 00:14:55.867 "claimed": true, 00:14:55.867 "claim_type": "exclusive_write", 00:14:55.867 "zoned": false, 00:14:55.867 "supported_io_types": { 00:14:55.867 
"read": true, 00:14:55.867 "write": true, 00:14:55.867 "unmap": true, 00:14:55.867 "flush": true, 00:14:55.867 "reset": true, 00:14:55.867 "nvme_admin": false, 00:14:55.867 "nvme_io": false, 00:14:55.867 "nvme_io_md": false, 00:14:55.867 "write_zeroes": true, 00:14:55.867 "zcopy": true, 00:14:55.867 "get_zone_info": false, 00:14:55.867 "zone_management": false, 00:14:55.867 "zone_append": false, 00:14:55.867 "compare": false, 00:14:55.867 "compare_and_write": false, 00:14:55.867 "abort": true, 00:14:55.867 "seek_hole": false, 00:14:55.867 "seek_data": false, 00:14:55.867 "copy": true, 00:14:55.867 "nvme_iov_md": false 00:14:55.867 }, 00:14:55.867 "memory_domains": [ 00:14:55.867 { 00:14:55.867 "dma_device_id": "system", 00:14:55.867 "dma_device_type": 1 00:14:55.867 }, 00:14:55.867 { 00:14:55.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.867 "dma_device_type": 2 00:14:55.867 } 00:14:55.867 ], 00:14:55.867 "driver_specific": {} 00:14:55.867 }' 00:14:55.867 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.867 11:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.867 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:55.867 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.867 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:56.125 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:56.383 "name": "BaseBdev4", 00:14:56.383 "aliases": [ 00:14:56.383 "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2" 00:14:56.383 ], 00:14:56.383 "product_name": "Malloc disk", 00:14:56.383 "block_size": 512, 00:14:56.383 "num_blocks": 65536, 00:14:56.383 "uuid": "25edd5aa-fa04-4b62-92f3-8a4b65cc75a2", 00:14:56.383 "assigned_rate_limits": { 00:14:56.383 "rw_ios_per_sec": 0, 00:14:56.383 "rw_mbytes_per_sec": 0, 00:14:56.383 "r_mbytes_per_sec": 0, 00:14:56.383 "w_mbytes_per_sec": 0 00:14:56.383 }, 00:14:56.383 "claimed": true, 00:14:56.383 "claim_type": "exclusive_write", 00:14:56.383 "zoned": false, 00:14:56.383 "supported_io_types": { 00:14:56.383 "read": true, 00:14:56.383 "write": true, 00:14:56.383 "unmap": true, 00:14:56.383 "flush": true, 00:14:56.383 "reset": true, 00:14:56.383 "nvme_admin": false, 00:14:56.383 "nvme_io": false, 00:14:56.383 "nvme_io_md": false, 00:14:56.383 "write_zeroes": true, 00:14:56.383 "zcopy": true, 00:14:56.383 "get_zone_info": false, 00:14:56.383 "zone_management": false, 00:14:56.383 "zone_append": false, 00:14:56.383 "compare": false, 00:14:56.383 "compare_and_write": false, 00:14:56.383 "abort": true, 00:14:56.383 "seek_hole": false, 00:14:56.383 "seek_data": false, 00:14:56.383 "copy": true, 00:14:56.383 "nvme_iov_md": 
false 00:14:56.383 }, 00:14:56.383 "memory_domains": [ 00:14:56.383 { 00:14:56.383 "dma_device_id": "system", 00:14:56.383 "dma_device_type": 1 00:14:56.383 }, 00:14:56.383 { 00:14:56.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.383 "dma_device_type": 2 00:14:56.383 } 00:14:56.383 ], 00:14:56.383 "driver_specific": {} 00:14:56.383 }' 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:56.383 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:56.642 [2024-07-12 11:55:46.824383] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:56.642 [2024-07-12 11:55:46.824401] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:14:56.642 [2024-07-12 11:55:46.824439] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:56.642 [2024-07-12 11:55:46.824480] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:56.642 [2024-07-12 11:55:46.824486] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25676c0 name Existed_Raid, state offline 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 649337 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 649337 ']' 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 649337 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 649337 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 649337' 00:14:56.642 killing process with pid 649337 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 649337 00:14:56.642 [2024-07-12 11:55:46.869626] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:56.642 11:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 649337 00:14:56.901 [2024-07-12 11:55:46.900594] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:56.901 11:55:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:56.901 00:14:56.901 real 0m23.700s 00:14:56.901 user 0m44.178s 00:14:56.901 sys 0m3.564s 00:14:56.901 11:55:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:56.901 11:55:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.901 ************************************ 00:14:56.901 END TEST raid_state_function_test 00:14:56.901 ************************************ 00:14:56.901 11:55:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:56.901 11:55:47 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:14:56.901 11:55:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:56.901 11:55:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:56.901 11:55:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:56.901 ************************************ 00:14:56.901 START TEST raid_state_function_test_sb 00:14:56.901 ************************************ 00:14:56.901 11:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:14:56.901 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:56.901 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:14:56.901 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:56.901 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:57.161 11:55:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:57.161 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:57.162 
11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=653885 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 653885' 00:14:57.162 Process raid pid: 653885 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 653885 /var/tmp/spdk-raid.sock 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 653885 ']' 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:57.162 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:57.162 11:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:57.162 [2024-07-12 11:55:47.206027] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:14:57.162 [2024-07-12 11:55:47.206069] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:57.162 [2024-07-12 11:55:47.274293] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:57.162 [2024-07-12 11:55:47.348226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:57.162 [2024-07-12 11:55:47.399240] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:57.162 [2024-07-12 11:55:47.399261] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:58.099 11:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:58.099 11:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:58.099 11:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:58.099 [2024-07-12 11:55:48.134608] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:58.099 [2024-07-12 11:55:48.134638] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:58.099 [2024-07-12 11:55:48.134644] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:58.099 [2024-07-12 11:55:48.134649] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:58.099 [2024-07-12 11:55:48.134653] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:58.099 [2024-07-12 11:55:48.134657] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:58.099 [2024-07-12 11:55:48.134678] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:58.099 [2024-07-12 11:55:48.134683] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.100 "name": "Existed_Raid", 00:14:58.100 "uuid": "48f6ad21-2caa-4ce4-8f79-2afafe7c4c3d", 00:14:58.100 "strip_size_kb": 64, 00:14:58.100 "state": "configuring", 00:14:58.100 "raid_level": "raid0", 00:14:58.100 "superblock": true, 00:14:58.100 "num_base_bdevs": 4, 00:14:58.100 "num_base_bdevs_discovered": 0, 00:14:58.100 "num_base_bdevs_operational": 4, 00:14:58.100 "base_bdevs_list": [ 00:14:58.100 { 00:14:58.100 "name": "BaseBdev1", 00:14:58.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.100 "is_configured": false, 00:14:58.100 "data_offset": 0, 00:14:58.100 "data_size": 0 00:14:58.100 }, 00:14:58.100 { 00:14:58.100 "name": "BaseBdev2", 00:14:58.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.100 "is_configured": false, 00:14:58.100 "data_offset": 0, 00:14:58.100 "data_size": 0 00:14:58.100 }, 00:14:58.100 { 00:14:58.100 "name": "BaseBdev3", 00:14:58.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.100 "is_configured": false, 00:14:58.100 "data_offset": 0, 00:14:58.100 "data_size": 0 00:14:58.100 }, 00:14:58.100 { 00:14:58.100 "name": "BaseBdev4", 00:14:58.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.100 "is_configured": false, 00:14:58.100 "data_offset": 0, 00:14:58.100 "data_size": 0 00:14:58.100 } 00:14:58.100 ] 00:14:58.100 }' 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.100 11:55:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:58.694 11:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:58.952 
[2024-07-12 11:55:48.980707] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:58.952 [2024-07-12 11:55:48.980728] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18d21f0 name Existed_Raid, state configuring 00:14:58.952 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:58.952 [2024-07-12 11:55:49.145140] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:58.952 [2024-07-12 11:55:49.145159] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:58.952 [2024-07-12 11:55:49.145164] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:58.952 [2024-07-12 11:55:49.145168] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:58.952 [2024-07-12 11:55:49.145172] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:58.952 [2024-07-12 11:55:49.145177] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:58.952 [2024-07-12 11:55:49.145181] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:58.952 [2024-07-12 11:55:49.145185] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:58.952 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:59.210 [2024-07-12 11:55:49.313776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:59.210 BaseBdev1 00:14:59.210 11:55:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:59.210 11:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:59.210 11:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:59.210 11:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:59.210 11:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:59.210 11:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:59.210 11:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:59.467 11:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:59.467 [ 00:14:59.467 { 00:14:59.467 "name": "BaseBdev1", 00:14:59.467 "aliases": [ 00:14:59.467 "98bf85cd-74ae-4685-993c-8571432c29fb" 00:14:59.467 ], 00:14:59.467 "product_name": "Malloc disk", 00:14:59.467 "block_size": 512, 00:14:59.467 "num_blocks": 65536, 00:14:59.467 "uuid": "98bf85cd-74ae-4685-993c-8571432c29fb", 00:14:59.467 "assigned_rate_limits": { 00:14:59.467 "rw_ios_per_sec": 0, 00:14:59.467 "rw_mbytes_per_sec": 0, 00:14:59.467 "r_mbytes_per_sec": 0, 00:14:59.467 "w_mbytes_per_sec": 0 00:14:59.467 }, 00:14:59.467 "claimed": true, 00:14:59.467 "claim_type": "exclusive_write", 00:14:59.467 "zoned": false, 00:14:59.467 "supported_io_types": { 00:14:59.467 "read": true, 00:14:59.467 "write": true, 00:14:59.467 "unmap": true, 00:14:59.467 "flush": true, 00:14:59.467 "reset": true, 00:14:59.467 "nvme_admin": false, 00:14:59.467 "nvme_io": false, 00:14:59.467 "nvme_io_md": 
false, 00:14:59.467 "write_zeroes": true, 00:14:59.467 "zcopy": true, 00:14:59.467 "get_zone_info": false, 00:14:59.467 "zone_management": false, 00:14:59.467 "zone_append": false, 00:14:59.467 "compare": false, 00:14:59.467 "compare_and_write": false, 00:14:59.467 "abort": true, 00:14:59.467 "seek_hole": false, 00:14:59.467 "seek_data": false, 00:14:59.467 "copy": true, 00:14:59.467 "nvme_iov_md": false 00:14:59.467 }, 00:14:59.467 "memory_domains": [ 00:14:59.467 { 00:14:59.467 "dma_device_id": "system", 00:14:59.467 "dma_device_type": 1 00:14:59.467 }, 00:14:59.467 { 00:14:59.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.467 "dma_device_type": 2 00:14:59.467 } 00:14:59.467 ], 00:14:59.467 "driver_specific": {} 00:14:59.467 } 00:14:59.467 ] 00:14:59.467 11:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.468 11:55:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.468 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.725 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.725 "name": "Existed_Raid", 00:14:59.725 "uuid": "bc337f3d-8688-4177-9648-68c19e891df9", 00:14:59.725 "strip_size_kb": 64, 00:14:59.725 "state": "configuring", 00:14:59.725 "raid_level": "raid0", 00:14:59.725 "superblock": true, 00:14:59.725 "num_base_bdevs": 4, 00:14:59.725 "num_base_bdevs_discovered": 1, 00:14:59.725 "num_base_bdevs_operational": 4, 00:14:59.725 "base_bdevs_list": [ 00:14:59.725 { 00:14:59.725 "name": "BaseBdev1", 00:14:59.725 "uuid": "98bf85cd-74ae-4685-993c-8571432c29fb", 00:14:59.725 "is_configured": true, 00:14:59.725 "data_offset": 2048, 00:14:59.725 "data_size": 63488 00:14:59.725 }, 00:14:59.725 { 00:14:59.725 "name": "BaseBdev2", 00:14:59.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.725 "is_configured": false, 00:14:59.725 "data_offset": 0, 00:14:59.725 "data_size": 0 00:14:59.725 }, 00:14:59.725 { 00:14:59.725 "name": "BaseBdev3", 00:14:59.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.725 "is_configured": false, 00:14:59.725 "data_offset": 0, 00:14:59.725 "data_size": 0 00:14:59.725 }, 00:14:59.725 { 00:14:59.725 "name": "BaseBdev4", 00:14:59.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.725 "is_configured": false, 00:14:59.725 "data_offset": 0, 00:14:59.725 "data_size": 0 00:14:59.725 } 00:14:59.725 ] 00:14:59.725 }' 00:14:59.725 11:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.725 11:55:49 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:00.288 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:00.288 [2024-07-12 11:55:50.480780] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:00.288 [2024-07-12 11:55:50.480812] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18d1a60 name Existed_Raid, state configuring 00:15:00.288 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:00.547 [2024-07-12 11:55:50.665293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:00.547 [2024-07-12 11:55:50.666341] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:00.547 [2024-07-12 11:55:50.666364] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:00.547 [2024-07-12 11:55:50.666369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:00.547 [2024-07-12 11:55:50.666374] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:00.547 [2024-07-12 11:55:50.666379] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:00.547 [2024-07-12 11:55:50.666384] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:00.547 11:55:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.547 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:00.815 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:00.815 "name": "Existed_Raid", 00:15:00.815 "uuid": "e4095e19-f4e4-42e1-b0c6-9972379470bd", 00:15:00.815 "strip_size_kb": 64, 00:15:00.815 "state": "configuring", 00:15:00.815 "raid_level": "raid0", 00:15:00.815 "superblock": true, 00:15:00.815 "num_base_bdevs": 4, 00:15:00.815 "num_base_bdevs_discovered": 1, 00:15:00.815 "num_base_bdevs_operational": 4, 00:15:00.815 
"base_bdevs_list": [ 00:15:00.815 { 00:15:00.815 "name": "BaseBdev1", 00:15:00.815 "uuid": "98bf85cd-74ae-4685-993c-8571432c29fb", 00:15:00.815 "is_configured": true, 00:15:00.815 "data_offset": 2048, 00:15:00.815 "data_size": 63488 00:15:00.815 }, 00:15:00.815 { 00:15:00.815 "name": "BaseBdev2", 00:15:00.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:00.815 "is_configured": false, 00:15:00.815 "data_offset": 0, 00:15:00.815 "data_size": 0 00:15:00.815 }, 00:15:00.815 { 00:15:00.815 "name": "BaseBdev3", 00:15:00.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:00.815 "is_configured": false, 00:15:00.815 "data_offset": 0, 00:15:00.815 "data_size": 0 00:15:00.815 }, 00:15:00.815 { 00:15:00.815 "name": "BaseBdev4", 00:15:00.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:00.815 "is_configured": false, 00:15:00.815 "data_offset": 0, 00:15:00.815 "data_size": 0 00:15:00.815 } 00:15:00.815 ] 00:15:00.815 }' 00:15:00.815 11:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:00.815 11:55:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:01.111 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:01.394 [2024-07-12 11:55:51.518229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:01.394 BaseBdev2 00:15:01.394 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:01.394 11:55:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:01.394 11:55:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:01.394 11:55:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:01.394 
11:55:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:01.394 11:55:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:01.394 11:55:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:01.653 [ 00:15:01.653 { 00:15:01.653 "name": "BaseBdev2", 00:15:01.653 "aliases": [ 00:15:01.653 "a5db1075-4be8-4f29-9d4b-7a87a625cf24" 00:15:01.653 ], 00:15:01.653 "product_name": "Malloc disk", 00:15:01.653 "block_size": 512, 00:15:01.653 "num_blocks": 65536, 00:15:01.653 "uuid": "a5db1075-4be8-4f29-9d4b-7a87a625cf24", 00:15:01.653 "assigned_rate_limits": { 00:15:01.653 "rw_ios_per_sec": 0, 00:15:01.653 "rw_mbytes_per_sec": 0, 00:15:01.653 "r_mbytes_per_sec": 0, 00:15:01.653 "w_mbytes_per_sec": 0 00:15:01.653 }, 00:15:01.653 "claimed": true, 00:15:01.653 "claim_type": "exclusive_write", 00:15:01.653 "zoned": false, 00:15:01.653 "supported_io_types": { 00:15:01.653 "read": true, 00:15:01.653 "write": true, 00:15:01.653 "unmap": true, 00:15:01.653 "flush": true, 00:15:01.653 "reset": true, 00:15:01.653 "nvme_admin": false, 00:15:01.653 "nvme_io": false, 00:15:01.653 "nvme_io_md": false, 00:15:01.653 "write_zeroes": true, 00:15:01.653 "zcopy": true, 00:15:01.653 "get_zone_info": false, 00:15:01.653 "zone_management": false, 00:15:01.653 "zone_append": false, 00:15:01.653 "compare": false, 00:15:01.653 "compare_and_write": false, 00:15:01.653 "abort": true, 00:15:01.653 "seek_hole": false, 00:15:01.653 "seek_data": false, 00:15:01.653 "copy": true, 00:15:01.653 "nvme_iov_md": false 00:15:01.653 }, 00:15:01.653 
"memory_domains": [ 00:15:01.653 { 00:15:01.653 "dma_device_id": "system", 00:15:01.653 "dma_device_type": 1 00:15:01.653 }, 00:15:01.653 { 00:15:01.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.653 "dma_device_type": 2 00:15:01.653 } 00:15:01.653 ], 00:15:01.653 "driver_specific": {} 00:15:01.653 } 00:15:01.653 ] 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.653 11:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:01.912 11:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.912 "name": "Existed_Raid", 00:15:01.912 "uuid": "e4095e19-f4e4-42e1-b0c6-9972379470bd", 00:15:01.912 "strip_size_kb": 64, 00:15:01.912 "state": "configuring", 00:15:01.912 "raid_level": "raid0", 00:15:01.912 "superblock": true, 00:15:01.912 "num_base_bdevs": 4, 00:15:01.912 "num_base_bdevs_discovered": 2, 00:15:01.912 "num_base_bdevs_operational": 4, 00:15:01.912 "base_bdevs_list": [ 00:15:01.912 { 00:15:01.912 "name": "BaseBdev1", 00:15:01.912 "uuid": "98bf85cd-74ae-4685-993c-8571432c29fb", 00:15:01.912 "is_configured": true, 00:15:01.912 "data_offset": 2048, 00:15:01.912 "data_size": 63488 00:15:01.912 }, 00:15:01.912 { 00:15:01.912 "name": "BaseBdev2", 00:15:01.912 "uuid": "a5db1075-4be8-4f29-9d4b-7a87a625cf24", 00:15:01.912 "is_configured": true, 00:15:01.912 "data_offset": 2048, 00:15:01.912 "data_size": 63488 00:15:01.912 }, 00:15:01.912 { 00:15:01.912 "name": "BaseBdev3", 00:15:01.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.912 "is_configured": false, 00:15:01.912 "data_offset": 0, 00:15:01.912 "data_size": 0 00:15:01.912 }, 00:15:01.912 { 00:15:01.912 "name": "BaseBdev4", 00:15:01.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.912 "is_configured": false, 00:15:01.912 "data_offset": 0, 00:15:01.912 "data_size": 0 00:15:01.912 } 00:15:01.912 ] 00:15:01.912 }' 00:15:01.912 11:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.912 11:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:02.479 11:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:02.479 [2024-07-12 11:55:52.659853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:02.479 BaseBdev3 00:15:02.479 11:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:02.479 11:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:02.479 11:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:02.479 11:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:02.479 11:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:02.479 11:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:02.479 11:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:02.739 11:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:02.997 [ 00:15:02.997 { 00:15:02.997 "name": "BaseBdev3", 00:15:02.997 "aliases": [ 00:15:02.997 "94ac7a81-5cee-4478-92b1-e685654043e0" 00:15:02.997 ], 00:15:02.997 "product_name": "Malloc disk", 00:15:02.997 "block_size": 512, 00:15:02.997 "num_blocks": 65536, 00:15:02.997 "uuid": "94ac7a81-5cee-4478-92b1-e685654043e0", 00:15:02.997 "assigned_rate_limits": { 00:15:02.997 "rw_ios_per_sec": 0, 00:15:02.997 "rw_mbytes_per_sec": 0, 00:15:02.997 "r_mbytes_per_sec": 0, 00:15:02.997 "w_mbytes_per_sec": 0 00:15:02.997 }, 00:15:02.997 "claimed": true, 00:15:02.997 "claim_type": "exclusive_write", 00:15:02.997 "zoned": false, 00:15:02.997 "supported_io_types": { 
00:15:02.997 "read": true, 00:15:02.997 "write": true, 00:15:02.997 "unmap": true, 00:15:02.997 "flush": true, 00:15:02.997 "reset": true, 00:15:02.997 "nvme_admin": false, 00:15:02.997 "nvme_io": false, 00:15:02.997 "nvme_io_md": false, 00:15:02.997 "write_zeroes": true, 00:15:02.997 "zcopy": true, 00:15:02.997 "get_zone_info": false, 00:15:02.997 "zone_management": false, 00:15:02.997 "zone_append": false, 00:15:02.997 "compare": false, 00:15:02.997 "compare_and_write": false, 00:15:02.997 "abort": true, 00:15:02.997 "seek_hole": false, 00:15:02.997 "seek_data": false, 00:15:02.997 "copy": true, 00:15:02.997 "nvme_iov_md": false 00:15:02.997 }, 00:15:02.997 "memory_domains": [ 00:15:02.997 { 00:15:02.997 "dma_device_id": "system", 00:15:02.997 "dma_device_type": 1 00:15:02.997 }, 00:15:02.997 { 00:15:02.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.997 "dma_device_type": 2 00:15:02.997 } 00:15:02.997 ], 00:15:02.997 "driver_specific": {} 00:15:02.997 } 00:15:02.997 ] 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:02.997 "name": "Existed_Raid", 00:15:02.997 "uuid": "e4095e19-f4e4-42e1-b0c6-9972379470bd", 00:15:02.997 "strip_size_kb": 64, 00:15:02.997 "state": "configuring", 00:15:02.997 "raid_level": "raid0", 00:15:02.997 "superblock": true, 00:15:02.997 "num_base_bdevs": 4, 00:15:02.997 "num_base_bdevs_discovered": 3, 00:15:02.997 "num_base_bdevs_operational": 4, 00:15:02.997 "base_bdevs_list": [ 00:15:02.997 { 00:15:02.997 "name": "BaseBdev1", 00:15:02.997 "uuid": "98bf85cd-74ae-4685-993c-8571432c29fb", 00:15:02.997 "is_configured": true, 00:15:02.997 "data_offset": 2048, 00:15:02.997 "data_size": 63488 00:15:02.997 }, 00:15:02.997 { 00:15:02.997 "name": "BaseBdev2", 00:15:02.997 "uuid": "a5db1075-4be8-4f29-9d4b-7a87a625cf24", 00:15:02.997 "is_configured": true, 00:15:02.997 "data_offset": 2048, 00:15:02.997 "data_size": 63488 00:15:02.997 }, 00:15:02.997 { 00:15:02.997 "name": "BaseBdev3", 00:15:02.997 "uuid": "94ac7a81-5cee-4478-92b1-e685654043e0", 00:15:02.997 "is_configured": true, 00:15:02.997 "data_offset": 2048, 00:15:02.997 
"data_size": 63488 00:15:02.997 }, 00:15:02.997 { 00:15:02.997 "name": "BaseBdev4", 00:15:02.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:02.997 "is_configured": false, 00:15:02.997 "data_offset": 0, 00:15:02.997 "data_size": 0 00:15:02.997 } 00:15:02.997 ] 00:15:02.997 }' 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:02.997 11:55:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:03.563 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:03.821 [2024-07-12 11:55:53.825460] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:03.821 [2024-07-12 11:55:53.825592] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18d2b90 00:15:03.822 [2024-07-12 11:55:53.825604] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:03.822 [2024-07-12 11:55:53.825723] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18d2700 00:15:03.822 [2024-07-12 11:55:53.825806] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18d2b90 00:15:03.822 [2024-07-12 11:55:53.825811] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18d2b90 00:15:03.822 [2024-07-12 11:55:53.825875] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:03.822 BaseBdev4 00:15:03.822 11:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:15:03.822 11:55:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:15:03.822 11:55:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:03.822 11:55:53 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:03.822 11:55:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:03.822 11:55:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:03.822 11:55:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:03.822 11:55:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:04.080 [ 00:15:04.080 { 00:15:04.080 "name": "BaseBdev4", 00:15:04.080 "aliases": [ 00:15:04.080 "e801bba3-952b-434e-b553-1491eee0761e" 00:15:04.080 ], 00:15:04.080 "product_name": "Malloc disk", 00:15:04.080 "block_size": 512, 00:15:04.080 "num_blocks": 65536, 00:15:04.080 "uuid": "e801bba3-952b-434e-b553-1491eee0761e", 00:15:04.080 "assigned_rate_limits": { 00:15:04.080 "rw_ios_per_sec": 0, 00:15:04.080 "rw_mbytes_per_sec": 0, 00:15:04.080 "r_mbytes_per_sec": 0, 00:15:04.080 "w_mbytes_per_sec": 0 00:15:04.080 }, 00:15:04.080 "claimed": true, 00:15:04.080 "claim_type": "exclusive_write", 00:15:04.080 "zoned": false, 00:15:04.080 "supported_io_types": { 00:15:04.080 "read": true, 00:15:04.080 "write": true, 00:15:04.080 "unmap": true, 00:15:04.080 "flush": true, 00:15:04.080 "reset": true, 00:15:04.080 "nvme_admin": false, 00:15:04.080 "nvme_io": false, 00:15:04.080 "nvme_io_md": false, 00:15:04.080 "write_zeroes": true, 00:15:04.080 "zcopy": true, 00:15:04.080 "get_zone_info": false, 00:15:04.080 "zone_management": false, 00:15:04.080 "zone_append": false, 00:15:04.080 "compare": false, 00:15:04.080 "compare_and_write": false, 00:15:04.080 "abort": true, 00:15:04.080 "seek_hole": false, 00:15:04.080 "seek_data": false, 
00:15:04.080 "copy": true, 00:15:04.080 "nvme_iov_md": false 00:15:04.081 }, 00:15:04.081 "memory_domains": [ 00:15:04.081 { 00:15:04.081 "dma_device_id": "system", 00:15:04.081 "dma_device_type": 1 00:15:04.081 }, 00:15:04.081 { 00:15:04.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.081 "dma_device_type": 2 00:15:04.081 } 00:15:04.081 ], 00:15:04.081 "driver_specific": {} 00:15:04.081 } 00:15:04.081 ] 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.081 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:04.339 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.339 "name": "Existed_Raid", 00:15:04.339 "uuid": "e4095e19-f4e4-42e1-b0c6-9972379470bd", 00:15:04.339 "strip_size_kb": 64, 00:15:04.339 "state": "online", 00:15:04.339 "raid_level": "raid0", 00:15:04.339 "superblock": true, 00:15:04.339 "num_base_bdevs": 4, 00:15:04.339 "num_base_bdevs_discovered": 4, 00:15:04.339 "num_base_bdevs_operational": 4, 00:15:04.339 "base_bdevs_list": [ 00:15:04.339 { 00:15:04.339 "name": "BaseBdev1", 00:15:04.339 "uuid": "98bf85cd-74ae-4685-993c-8571432c29fb", 00:15:04.339 "is_configured": true, 00:15:04.339 "data_offset": 2048, 00:15:04.339 "data_size": 63488 00:15:04.339 }, 00:15:04.339 { 00:15:04.339 "name": "BaseBdev2", 00:15:04.339 "uuid": "a5db1075-4be8-4f29-9d4b-7a87a625cf24", 00:15:04.339 "is_configured": true, 00:15:04.339 "data_offset": 2048, 00:15:04.339 "data_size": 63488 00:15:04.339 }, 00:15:04.339 { 00:15:04.340 "name": "BaseBdev3", 00:15:04.340 "uuid": "94ac7a81-5cee-4478-92b1-e685654043e0", 00:15:04.340 "is_configured": true, 00:15:04.340 "data_offset": 2048, 00:15:04.340 "data_size": 63488 00:15:04.340 }, 00:15:04.340 { 00:15:04.340 "name": "BaseBdev4", 00:15:04.340 "uuid": "e801bba3-952b-434e-b553-1491eee0761e", 00:15:04.340 "is_configured": true, 00:15:04.340 "data_offset": 2048, 00:15:04.340 "data_size": 63488 00:15:04.340 } 00:15:04.340 ] 00:15:04.340 }' 00:15:04.340 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.340 11:55:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:04.598 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # 
verify_raid_bdev_properties Existed_Raid 00:15:04.598 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:04.598 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:04.598 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:04.598 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:04.598 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:04.598 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:04.598 11:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:04.857 [2024-07-12 11:55:54.980690] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:04.857 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:04.857 "name": "Existed_Raid", 00:15:04.857 "aliases": [ 00:15:04.857 "e4095e19-f4e4-42e1-b0c6-9972379470bd" 00:15:04.857 ], 00:15:04.857 "product_name": "Raid Volume", 00:15:04.857 "block_size": 512, 00:15:04.857 "num_blocks": 253952, 00:15:04.857 "uuid": "e4095e19-f4e4-42e1-b0c6-9972379470bd", 00:15:04.857 "assigned_rate_limits": { 00:15:04.857 "rw_ios_per_sec": 0, 00:15:04.857 "rw_mbytes_per_sec": 0, 00:15:04.857 "r_mbytes_per_sec": 0, 00:15:04.857 "w_mbytes_per_sec": 0 00:15:04.857 }, 00:15:04.857 "claimed": false, 00:15:04.857 "zoned": false, 00:15:04.857 "supported_io_types": { 00:15:04.857 "read": true, 00:15:04.857 "write": true, 00:15:04.857 "unmap": true, 00:15:04.857 "flush": true, 00:15:04.857 "reset": true, 00:15:04.857 "nvme_admin": false, 00:15:04.857 "nvme_io": false, 00:15:04.857 "nvme_io_md": false, 00:15:04.857 
"write_zeroes": true, 00:15:04.857 "zcopy": false, 00:15:04.857 "get_zone_info": false, 00:15:04.857 "zone_management": false, 00:15:04.857 "zone_append": false, 00:15:04.857 "compare": false, 00:15:04.857 "compare_and_write": false, 00:15:04.857 "abort": false, 00:15:04.857 "seek_hole": false, 00:15:04.857 "seek_data": false, 00:15:04.857 "copy": false, 00:15:04.857 "nvme_iov_md": false 00:15:04.857 }, 00:15:04.857 "memory_domains": [ 00:15:04.857 { 00:15:04.857 "dma_device_id": "system", 00:15:04.857 "dma_device_type": 1 00:15:04.857 }, 00:15:04.857 { 00:15:04.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.857 "dma_device_type": 2 00:15:04.857 }, 00:15:04.857 { 00:15:04.857 "dma_device_id": "system", 00:15:04.857 "dma_device_type": 1 00:15:04.857 }, 00:15:04.857 { 00:15:04.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.857 "dma_device_type": 2 00:15:04.857 }, 00:15:04.857 { 00:15:04.857 "dma_device_id": "system", 00:15:04.857 "dma_device_type": 1 00:15:04.857 }, 00:15:04.857 { 00:15:04.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.857 "dma_device_type": 2 00:15:04.857 }, 00:15:04.857 { 00:15:04.857 "dma_device_id": "system", 00:15:04.857 "dma_device_type": 1 00:15:04.857 }, 00:15:04.857 { 00:15:04.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.857 "dma_device_type": 2 00:15:04.857 } 00:15:04.857 ], 00:15:04.857 "driver_specific": { 00:15:04.857 "raid": { 00:15:04.857 "uuid": "e4095e19-f4e4-42e1-b0c6-9972379470bd", 00:15:04.857 "strip_size_kb": 64, 00:15:04.857 "state": "online", 00:15:04.857 "raid_level": "raid0", 00:15:04.857 "superblock": true, 00:15:04.857 "num_base_bdevs": 4, 00:15:04.857 "num_base_bdevs_discovered": 4, 00:15:04.857 "num_base_bdevs_operational": 4, 00:15:04.857 "base_bdevs_list": [ 00:15:04.857 { 00:15:04.857 "name": "BaseBdev1", 00:15:04.857 "uuid": "98bf85cd-74ae-4685-993c-8571432c29fb", 00:15:04.857 "is_configured": true, 00:15:04.857 "data_offset": 2048, 00:15:04.857 "data_size": 63488 00:15:04.857 }, 
00:15:04.857 { 00:15:04.857 "name": "BaseBdev2", 00:15:04.857 "uuid": "a5db1075-4be8-4f29-9d4b-7a87a625cf24", 00:15:04.857 "is_configured": true, 00:15:04.857 "data_offset": 2048, 00:15:04.857 "data_size": 63488 00:15:04.857 }, 00:15:04.857 { 00:15:04.857 "name": "BaseBdev3", 00:15:04.857 "uuid": "94ac7a81-5cee-4478-92b1-e685654043e0", 00:15:04.857 "is_configured": true, 00:15:04.857 "data_offset": 2048, 00:15:04.857 "data_size": 63488 00:15:04.857 }, 00:15:04.857 { 00:15:04.857 "name": "BaseBdev4", 00:15:04.857 "uuid": "e801bba3-952b-434e-b553-1491eee0761e", 00:15:04.857 "is_configured": true, 00:15:04.857 "data_offset": 2048, 00:15:04.857 "data_size": 63488 00:15:04.857 } 00:15:04.857 ] 00:15:04.857 } 00:15:04.857 } 00:15:04.857 }' 00:15:04.857 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:04.857 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:04.857 BaseBdev2 00:15:04.857 BaseBdev3 00:15:04.857 BaseBdev4' 00:15:04.857 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.857 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.857 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:05.116 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:05.116 "name": "BaseBdev1", 00:15:05.116 "aliases": [ 00:15:05.116 "98bf85cd-74ae-4685-993c-8571432c29fb" 00:15:05.116 ], 00:15:05.116 "product_name": "Malloc disk", 00:15:05.116 "block_size": 512, 00:15:05.116 "num_blocks": 65536, 00:15:05.116 "uuid": "98bf85cd-74ae-4685-993c-8571432c29fb", 00:15:05.116 "assigned_rate_limits": { 00:15:05.116 
"rw_ios_per_sec": 0, 00:15:05.116 "rw_mbytes_per_sec": 0, 00:15:05.116 "r_mbytes_per_sec": 0, 00:15:05.116 "w_mbytes_per_sec": 0 00:15:05.116 }, 00:15:05.116 "claimed": true, 00:15:05.116 "claim_type": "exclusive_write", 00:15:05.116 "zoned": false, 00:15:05.116 "supported_io_types": { 00:15:05.116 "read": true, 00:15:05.116 "write": true, 00:15:05.116 "unmap": true, 00:15:05.116 "flush": true, 00:15:05.116 "reset": true, 00:15:05.116 "nvme_admin": false, 00:15:05.116 "nvme_io": false, 00:15:05.116 "nvme_io_md": false, 00:15:05.116 "write_zeroes": true, 00:15:05.116 "zcopy": true, 00:15:05.116 "get_zone_info": false, 00:15:05.116 "zone_management": false, 00:15:05.116 "zone_append": false, 00:15:05.116 "compare": false, 00:15:05.116 "compare_and_write": false, 00:15:05.116 "abort": true, 00:15:05.116 "seek_hole": false, 00:15:05.116 "seek_data": false, 00:15:05.116 "copy": true, 00:15:05.116 "nvme_iov_md": false 00:15:05.116 }, 00:15:05.116 "memory_domains": [ 00:15:05.116 { 00:15:05.116 "dma_device_id": "system", 00:15:05.116 "dma_device_type": 1 00:15:05.116 }, 00:15:05.116 { 00:15:05.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.116 "dma_device_type": 2 00:15:05.116 } 00:15:05.116 ], 00:15:05.116 "driver_specific": {} 00:15:05.116 }' 00:15:05.116 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.116 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.116 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:05.116 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.116 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:05.374 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:05.633 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:05.633 "name": "BaseBdev2", 00:15:05.633 "aliases": [ 00:15:05.633 "a5db1075-4be8-4f29-9d4b-7a87a625cf24" 00:15:05.633 ], 00:15:05.633 "product_name": "Malloc disk", 00:15:05.633 "block_size": 512, 00:15:05.633 "num_blocks": 65536, 00:15:05.633 "uuid": "a5db1075-4be8-4f29-9d4b-7a87a625cf24", 00:15:05.633 "assigned_rate_limits": { 00:15:05.633 "rw_ios_per_sec": 0, 00:15:05.633 "rw_mbytes_per_sec": 0, 00:15:05.633 "r_mbytes_per_sec": 0, 00:15:05.633 "w_mbytes_per_sec": 0 00:15:05.633 }, 00:15:05.633 "claimed": true, 00:15:05.633 "claim_type": "exclusive_write", 00:15:05.633 "zoned": false, 00:15:05.633 "supported_io_types": { 00:15:05.633 "read": true, 00:15:05.633 "write": true, 00:15:05.633 "unmap": true, 00:15:05.633 "flush": true, 00:15:05.633 "reset": true, 00:15:05.633 "nvme_admin": false, 00:15:05.633 "nvme_io": false, 00:15:05.633 "nvme_io_md": false, 00:15:05.633 "write_zeroes": true, 
00:15:05.633 "zcopy": true, 00:15:05.633 "get_zone_info": false, 00:15:05.633 "zone_management": false, 00:15:05.633 "zone_append": false, 00:15:05.633 "compare": false, 00:15:05.633 "compare_and_write": false, 00:15:05.633 "abort": true, 00:15:05.633 "seek_hole": false, 00:15:05.633 "seek_data": false, 00:15:05.633 "copy": true, 00:15:05.633 "nvme_iov_md": false 00:15:05.633 }, 00:15:05.633 "memory_domains": [ 00:15:05.633 { 00:15:05.633 "dma_device_id": "system", 00:15:05.633 "dma_device_type": 1 00:15:05.633 }, 00:15:05.633 { 00:15:05.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.633 "dma_device_type": 2 00:15:05.633 } 00:15:05.633 ], 00:15:05.633 "driver_specific": {} 00:15:05.633 }' 00:15:05.633 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.633 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.633 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:05.633 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.633 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.633 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:05.633 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.892 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.892 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:05.892 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.892 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.892 11:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:05.892 11:55:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:05.892 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:05.892 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:06.150 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:06.150 "name": "BaseBdev3", 00:15:06.150 "aliases": [ 00:15:06.150 "94ac7a81-5cee-4478-92b1-e685654043e0" 00:15:06.150 ], 00:15:06.150 "product_name": "Malloc disk", 00:15:06.150 "block_size": 512, 00:15:06.150 "num_blocks": 65536, 00:15:06.150 "uuid": "94ac7a81-5cee-4478-92b1-e685654043e0", 00:15:06.150 "assigned_rate_limits": { 00:15:06.150 "rw_ios_per_sec": 0, 00:15:06.150 "rw_mbytes_per_sec": 0, 00:15:06.150 "r_mbytes_per_sec": 0, 00:15:06.150 "w_mbytes_per_sec": 0 00:15:06.150 }, 00:15:06.150 "claimed": true, 00:15:06.150 "claim_type": "exclusive_write", 00:15:06.150 "zoned": false, 00:15:06.150 "supported_io_types": { 00:15:06.150 "read": true, 00:15:06.150 "write": true, 00:15:06.150 "unmap": true, 00:15:06.150 "flush": true, 00:15:06.150 "reset": true, 00:15:06.150 "nvme_admin": false, 00:15:06.150 "nvme_io": false, 00:15:06.150 "nvme_io_md": false, 00:15:06.150 "write_zeroes": true, 00:15:06.150 "zcopy": true, 00:15:06.150 "get_zone_info": false, 00:15:06.150 "zone_management": false, 00:15:06.150 "zone_append": false, 00:15:06.150 "compare": false, 00:15:06.150 "compare_and_write": false, 00:15:06.150 "abort": true, 00:15:06.150 "seek_hole": false, 00:15:06.150 "seek_data": false, 00:15:06.150 "copy": true, 00:15:06.150 "nvme_iov_md": false 00:15:06.150 }, 00:15:06.150 "memory_domains": [ 00:15:06.150 { 00:15:06.150 "dma_device_id": "system", 00:15:06.150 "dma_device_type": 1 00:15:06.150 }, 00:15:06.150 { 00:15:06.150 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:06.150 "dma_device_type": 2 00:15:06.150 } 00:15:06.150 ], 00:15:06.150 "driver_specific": {} 00:15:06.150 }' 00:15:06.150 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.150 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.150 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:06.150 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:06.150 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:06.150 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:06.150 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.150 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.410 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:06.410 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.410 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.410 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:06.410 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:06.410 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:06.410 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:06.410 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:06.410 "name": "BaseBdev4", 00:15:06.410 
"aliases": [ 00:15:06.410 "e801bba3-952b-434e-b553-1491eee0761e" 00:15:06.410 ], 00:15:06.410 "product_name": "Malloc disk", 00:15:06.410 "block_size": 512, 00:15:06.410 "num_blocks": 65536, 00:15:06.410 "uuid": "e801bba3-952b-434e-b553-1491eee0761e", 00:15:06.410 "assigned_rate_limits": { 00:15:06.410 "rw_ios_per_sec": 0, 00:15:06.410 "rw_mbytes_per_sec": 0, 00:15:06.410 "r_mbytes_per_sec": 0, 00:15:06.410 "w_mbytes_per_sec": 0 00:15:06.410 }, 00:15:06.410 "claimed": true, 00:15:06.410 "claim_type": "exclusive_write", 00:15:06.410 "zoned": false, 00:15:06.410 "supported_io_types": { 00:15:06.410 "read": true, 00:15:06.410 "write": true, 00:15:06.410 "unmap": true, 00:15:06.410 "flush": true, 00:15:06.410 "reset": true, 00:15:06.410 "nvme_admin": false, 00:15:06.410 "nvme_io": false, 00:15:06.410 "nvme_io_md": false, 00:15:06.410 "write_zeroes": true, 00:15:06.410 "zcopy": true, 00:15:06.410 "get_zone_info": false, 00:15:06.410 "zone_management": false, 00:15:06.410 "zone_append": false, 00:15:06.410 "compare": false, 00:15:06.410 "compare_and_write": false, 00:15:06.410 "abort": true, 00:15:06.410 "seek_hole": false, 00:15:06.410 "seek_data": false, 00:15:06.410 "copy": true, 00:15:06.410 "nvme_iov_md": false 00:15:06.410 }, 00:15:06.410 "memory_domains": [ 00:15:06.410 { 00:15:06.410 "dma_device_id": "system", 00:15:06.410 "dma_device_type": 1 00:15:06.410 }, 00:15:06.410 { 00:15:06.410 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.410 "dma_device_type": 2 00:15:06.410 } 00:15:06.410 ], 00:15:06.410 "driver_specific": {} 00:15:06.410 }' 00:15:06.410 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.668 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.927 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:06.927 11:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:06.927 [2024-07-12 11:55:57.090001] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:06.927 [2024-07-12 11:55:57.090022] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:06.927 [2024-07-12 11:55:57.090057] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:06.927 11:55:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.927 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.185 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.185 "name": "Existed_Raid", 00:15:07.185 "uuid": "e4095e19-f4e4-42e1-b0c6-9972379470bd", 00:15:07.185 "strip_size_kb": 64, 00:15:07.185 "state": "offline", 00:15:07.185 "raid_level": "raid0", 00:15:07.185 "superblock": true, 00:15:07.185 "num_base_bdevs": 4, 00:15:07.185 "num_base_bdevs_discovered": 3, 00:15:07.185 "num_base_bdevs_operational": 3, 00:15:07.185 "base_bdevs_list": [ 
00:15:07.185 { 00:15:07.185 "name": null, 00:15:07.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.185 "is_configured": false, 00:15:07.185 "data_offset": 2048, 00:15:07.185 "data_size": 63488 00:15:07.185 }, 00:15:07.185 { 00:15:07.185 "name": "BaseBdev2", 00:15:07.185 "uuid": "a5db1075-4be8-4f29-9d4b-7a87a625cf24", 00:15:07.185 "is_configured": true, 00:15:07.185 "data_offset": 2048, 00:15:07.185 "data_size": 63488 00:15:07.185 }, 00:15:07.185 { 00:15:07.185 "name": "BaseBdev3", 00:15:07.185 "uuid": "94ac7a81-5cee-4478-92b1-e685654043e0", 00:15:07.185 "is_configured": true, 00:15:07.185 "data_offset": 2048, 00:15:07.185 "data_size": 63488 00:15:07.185 }, 00:15:07.185 { 00:15:07.185 "name": "BaseBdev4", 00:15:07.185 "uuid": "e801bba3-952b-434e-b553-1491eee0761e", 00:15:07.185 "is_configured": true, 00:15:07.185 "data_offset": 2048, 00:15:07.185 "data_size": 63488 00:15:07.185 } 00:15:07.185 ] 00:15:07.185 }' 00:15:07.185 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.185 11:55:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.752 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:07.752 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:07.752 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.752 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:07.752 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:07.752 11:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:07.752 11:55:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:08.010 [2024-07-12 11:55:58.085511] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:08.010 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:08.010 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:08.010 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.010 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:08.269 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:08.269 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:08.269 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:08.269 [2024-07-12 11:55:58.432300] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:08.269 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:08.269 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:08.269 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.269 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:08.527 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:08.527 
11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:08.527 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:08.786 [2024-07-12 11:55:58.775045] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:08.786 [2024-07-12 11:55:58.775076] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18d2b90 name Existed_Raid, state offline 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:08.786 11:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:09.044 BaseBdev2 00:15:09.044 11:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev2 00:15:09.044 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:09.044 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:09.044 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:09.044 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:09.044 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:09.044 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:09.303 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:09.303 [ 00:15:09.303 { 00:15:09.303 "name": "BaseBdev2", 00:15:09.303 "aliases": [ 00:15:09.303 "58634589-7202-4567-8ca0-c1cd1f492bfb" 00:15:09.303 ], 00:15:09.303 "product_name": "Malloc disk", 00:15:09.303 "block_size": 512, 00:15:09.303 "num_blocks": 65536, 00:15:09.303 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:09.303 "assigned_rate_limits": { 00:15:09.303 "rw_ios_per_sec": 0, 00:15:09.303 "rw_mbytes_per_sec": 0, 00:15:09.303 "r_mbytes_per_sec": 0, 00:15:09.303 "w_mbytes_per_sec": 0 00:15:09.303 }, 00:15:09.303 "claimed": false, 00:15:09.303 "zoned": false, 00:15:09.303 "supported_io_types": { 00:15:09.303 "read": true, 00:15:09.303 "write": true, 00:15:09.303 "unmap": true, 00:15:09.303 "flush": true, 00:15:09.303 "reset": true, 00:15:09.303 "nvme_admin": false, 00:15:09.303 "nvme_io": false, 00:15:09.303 "nvme_io_md": false, 00:15:09.303 "write_zeroes": true, 00:15:09.303 "zcopy": true, 00:15:09.303 "get_zone_info": false, 00:15:09.303 
"zone_management": false, 00:15:09.303 "zone_append": false, 00:15:09.303 "compare": false, 00:15:09.303 "compare_and_write": false, 00:15:09.303 "abort": true, 00:15:09.303 "seek_hole": false, 00:15:09.303 "seek_data": false, 00:15:09.303 "copy": true, 00:15:09.303 "nvme_iov_md": false 00:15:09.303 }, 00:15:09.303 "memory_domains": [ 00:15:09.303 { 00:15:09.303 "dma_device_id": "system", 00:15:09.303 "dma_device_type": 1 00:15:09.303 }, 00:15:09.303 { 00:15:09.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.303 "dma_device_type": 2 00:15:09.303 } 00:15:09.303 ], 00:15:09.303 "driver_specific": {} 00:15:09.303 } 00:15:09.303 ] 00:15:09.303 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:09.303 11:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:09.303 11:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:09.303 11:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:09.561 BaseBdev3 00:15:09.561 11:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:09.561 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:09.561 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:09.561 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:09.561 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:09.561 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:09.561 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:09.561 11:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:09.820 [ 00:15:09.820 { 00:15:09.820 "name": "BaseBdev3", 00:15:09.820 "aliases": [ 00:15:09.820 "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d" 00:15:09.820 ], 00:15:09.820 "product_name": "Malloc disk", 00:15:09.820 "block_size": 512, 00:15:09.820 "num_blocks": 65536, 00:15:09.820 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:09.820 "assigned_rate_limits": { 00:15:09.820 "rw_ios_per_sec": 0, 00:15:09.820 "rw_mbytes_per_sec": 0, 00:15:09.820 "r_mbytes_per_sec": 0, 00:15:09.820 "w_mbytes_per_sec": 0 00:15:09.820 }, 00:15:09.820 "claimed": false, 00:15:09.820 "zoned": false, 00:15:09.820 "supported_io_types": { 00:15:09.820 "read": true, 00:15:09.820 "write": true, 00:15:09.820 "unmap": true, 00:15:09.820 "flush": true, 00:15:09.820 "reset": true, 00:15:09.820 "nvme_admin": false, 00:15:09.820 "nvme_io": false, 00:15:09.820 "nvme_io_md": false, 00:15:09.820 "write_zeroes": true, 00:15:09.820 "zcopy": true, 00:15:09.820 "get_zone_info": false, 00:15:09.820 "zone_management": false, 00:15:09.820 "zone_append": false, 00:15:09.820 "compare": false, 00:15:09.820 "compare_and_write": false, 00:15:09.820 "abort": true, 00:15:09.820 "seek_hole": false, 00:15:09.820 "seek_data": false, 00:15:09.820 "copy": true, 00:15:09.820 "nvme_iov_md": false 00:15:09.820 }, 00:15:09.820 "memory_domains": [ 00:15:09.820 { 00:15:09.820 "dma_device_id": "system", 00:15:09.820 "dma_device_type": 1 00:15:09.820 }, 00:15:09.820 { 00:15:09.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.820 "dma_device_type": 2 00:15:09.820 } 00:15:09.820 ], 00:15:09.820 "driver_specific": {} 00:15:09.820 } 00:15:09.820 ] 00:15:09.820 11:55:59 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:09.820 11:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:09.820 11:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:09.820 11:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:10.079 BaseBdev4 00:15:10.079 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:15:10.079 11:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:15:10.079 11:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:10.079 11:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:10.079 11:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:10.079 11:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:10.079 11:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:10.079 11:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:10.337 [ 00:15:10.337 { 00:15:10.337 "name": "BaseBdev4", 00:15:10.337 "aliases": [ 00:15:10.337 "6e869165-0c62-4973-94e7-2bd222d1bc68" 00:15:10.337 ], 00:15:10.337 "product_name": "Malloc disk", 00:15:10.338 "block_size": 512, 00:15:10.338 "num_blocks": 65536, 00:15:10.338 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 
00:15:10.338 "assigned_rate_limits": { 00:15:10.338 "rw_ios_per_sec": 0, 00:15:10.338 "rw_mbytes_per_sec": 0, 00:15:10.338 "r_mbytes_per_sec": 0, 00:15:10.338 "w_mbytes_per_sec": 0 00:15:10.338 }, 00:15:10.338 "claimed": false, 00:15:10.338 "zoned": false, 00:15:10.338 "supported_io_types": { 00:15:10.338 "read": true, 00:15:10.338 "write": true, 00:15:10.338 "unmap": true, 00:15:10.338 "flush": true, 00:15:10.338 "reset": true, 00:15:10.338 "nvme_admin": false, 00:15:10.338 "nvme_io": false, 00:15:10.338 "nvme_io_md": false, 00:15:10.338 "write_zeroes": true, 00:15:10.338 "zcopy": true, 00:15:10.338 "get_zone_info": false, 00:15:10.338 "zone_management": false, 00:15:10.338 "zone_append": false, 00:15:10.338 "compare": false, 00:15:10.338 "compare_and_write": false, 00:15:10.338 "abort": true, 00:15:10.338 "seek_hole": false, 00:15:10.338 "seek_data": false, 00:15:10.338 "copy": true, 00:15:10.338 "nvme_iov_md": false 00:15:10.338 }, 00:15:10.338 "memory_domains": [ 00:15:10.338 { 00:15:10.338 "dma_device_id": "system", 00:15:10.338 "dma_device_type": 1 00:15:10.338 }, 00:15:10.338 { 00:15:10.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.338 "dma_device_type": 2 00:15:10.338 } 00:15:10.338 ], 00:15:10.338 "driver_specific": {} 00:15:10.338 } 00:15:10.338 ] 00:15:10.338 11:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:10.338 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:10.338 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:10.338 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:10.597 [2024-07-12 11:56:00.629125] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:15:10.597 [2024-07-12 11:56:00.629155] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:10.597 [2024-07-12 11:56:00.629167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:10.597 [2024-07-12 11:56:00.630131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:10.597 [2024-07-12 11:56:00.630161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.597 "name": "Existed_Raid", 00:15:10.597 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:10.597 "strip_size_kb": 64, 00:15:10.597 "state": "configuring", 00:15:10.597 "raid_level": "raid0", 00:15:10.597 "superblock": true, 00:15:10.597 "num_base_bdevs": 4, 00:15:10.597 "num_base_bdevs_discovered": 3, 00:15:10.597 "num_base_bdevs_operational": 4, 00:15:10.597 "base_bdevs_list": [ 00:15:10.597 { 00:15:10.597 "name": "BaseBdev1", 00:15:10.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.597 "is_configured": false, 00:15:10.597 "data_offset": 0, 00:15:10.597 "data_size": 0 00:15:10.597 }, 00:15:10.597 { 00:15:10.597 "name": "BaseBdev2", 00:15:10.597 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:10.597 "is_configured": true, 00:15:10.597 "data_offset": 2048, 00:15:10.597 "data_size": 63488 00:15:10.597 }, 00:15:10.597 { 00:15:10.597 "name": "BaseBdev3", 00:15:10.597 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:10.597 "is_configured": true, 00:15:10.597 "data_offset": 2048, 00:15:10.597 "data_size": 63488 00:15:10.597 }, 00:15:10.597 { 00:15:10.597 "name": "BaseBdev4", 00:15:10.597 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:10.597 "is_configured": true, 00:15:10.597 "data_offset": 2048, 00:15:10.597 "data_size": 63488 00:15:10.597 } 00:15:10.597 ] 00:15:10.597 }' 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.597 11:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.164 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:11.424 [2024-07-12 11:56:01.427166] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.424 "name": "Existed_Raid", 00:15:11.424 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:11.424 "strip_size_kb": 64, 00:15:11.424 "state": "configuring", 00:15:11.424 "raid_level": "raid0", 00:15:11.424 "superblock": true, 00:15:11.424 "num_base_bdevs": 4, 00:15:11.424 
"num_base_bdevs_discovered": 2, 00:15:11.424 "num_base_bdevs_operational": 4, 00:15:11.424 "base_bdevs_list": [ 00:15:11.424 { 00:15:11.424 "name": "BaseBdev1", 00:15:11.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.424 "is_configured": false, 00:15:11.424 "data_offset": 0, 00:15:11.424 "data_size": 0 00:15:11.424 }, 00:15:11.424 { 00:15:11.424 "name": null, 00:15:11.424 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:11.424 "is_configured": false, 00:15:11.424 "data_offset": 2048, 00:15:11.424 "data_size": 63488 00:15:11.424 }, 00:15:11.424 { 00:15:11.424 "name": "BaseBdev3", 00:15:11.424 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:11.424 "is_configured": true, 00:15:11.424 "data_offset": 2048, 00:15:11.424 "data_size": 63488 00:15:11.424 }, 00:15:11.424 { 00:15:11.424 "name": "BaseBdev4", 00:15:11.424 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:11.424 "is_configured": true, 00:15:11.424 "data_offset": 2048, 00:15:11.424 "data_size": 63488 00:15:11.424 } 00:15:11.424 ] 00:15:11.424 }' 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.424 11:56:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.992 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.992 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:12.250 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:12.250 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:12.250 [2024-07-12 11:56:02.424513] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:12.250 BaseBdev1 00:15:12.250 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:12.250 11:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:12.250 11:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:12.250 11:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:12.250 11:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:12.250 11:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:12.250 11:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:12.508 [ 00:15:12.508 { 00:15:12.508 "name": "BaseBdev1", 00:15:12.508 "aliases": [ 00:15:12.508 "edf09e32-8eb3-4193-922a-9d49af7ffa9d" 00:15:12.508 ], 00:15:12.508 "product_name": "Malloc disk", 00:15:12.508 "block_size": 512, 00:15:12.508 "num_blocks": 65536, 00:15:12.508 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:12.508 "assigned_rate_limits": { 00:15:12.508 "rw_ios_per_sec": 0, 00:15:12.508 "rw_mbytes_per_sec": 0, 00:15:12.508 "r_mbytes_per_sec": 0, 00:15:12.508 "w_mbytes_per_sec": 0 00:15:12.508 }, 00:15:12.508 "claimed": true, 00:15:12.508 "claim_type": "exclusive_write", 00:15:12.508 "zoned": false, 00:15:12.508 "supported_io_types": { 00:15:12.508 "read": true, 00:15:12.508 "write": true, 00:15:12.508 "unmap": true, 00:15:12.508 "flush": 
true, 00:15:12.508 "reset": true, 00:15:12.508 "nvme_admin": false, 00:15:12.508 "nvme_io": false, 00:15:12.508 "nvme_io_md": false, 00:15:12.508 "write_zeroes": true, 00:15:12.508 "zcopy": true, 00:15:12.508 "get_zone_info": false, 00:15:12.508 "zone_management": false, 00:15:12.508 "zone_append": false, 00:15:12.508 "compare": false, 00:15:12.508 "compare_and_write": false, 00:15:12.508 "abort": true, 00:15:12.508 "seek_hole": false, 00:15:12.508 "seek_data": false, 00:15:12.508 "copy": true, 00:15:12.508 "nvme_iov_md": false 00:15:12.508 }, 00:15:12.508 "memory_domains": [ 00:15:12.508 { 00:15:12.508 "dma_device_id": "system", 00:15:12.508 "dma_device_type": 1 00:15:12.508 }, 00:15:12.508 { 00:15:12.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.508 "dma_device_type": 2 00:15:12.508 } 00:15:12.508 ], 00:15:12.508 "driver_specific": {} 00:15:12.508 } 00:15:12.508 ] 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.508 11:56:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.508 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:12.766 11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:12.766 "name": "Existed_Raid", 00:15:12.766 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:12.766 "strip_size_kb": 64, 00:15:12.766 "state": "configuring", 00:15:12.766 "raid_level": "raid0", 00:15:12.766 "superblock": true, 00:15:12.766 "num_base_bdevs": 4, 00:15:12.766 "num_base_bdevs_discovered": 3, 00:15:12.766 "num_base_bdevs_operational": 4, 00:15:12.766 "base_bdevs_list": [ 00:15:12.766 { 00:15:12.766 "name": "BaseBdev1", 00:15:12.766 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:12.766 "is_configured": true, 00:15:12.766 "data_offset": 2048, 00:15:12.766 "data_size": 63488 00:15:12.766 }, 00:15:12.766 { 00:15:12.766 "name": null, 00:15:12.766 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:12.766 "is_configured": false, 00:15:12.766 "data_offset": 2048, 00:15:12.766 "data_size": 63488 00:15:12.766 }, 00:15:12.766 { 00:15:12.766 "name": "BaseBdev3", 00:15:12.766 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:12.766 "is_configured": true, 00:15:12.766 "data_offset": 2048, 00:15:12.766 "data_size": 63488 00:15:12.766 }, 00:15:12.766 { 00:15:12.766 "name": "BaseBdev4", 00:15:12.766 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:12.766 "is_configured": true, 00:15:12.766 "data_offset": 2048, 00:15:12.766 "data_size": 63488 00:15:12.766 } 00:15:12.766 ] 00:15:12.766 }' 00:15:12.766 
11:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:12.766 11:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:13.332 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.332 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:13.332 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:13.332 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:13.591 [2024-07-12 11:56:03.699852] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.591 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.849 11:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.849 "name": "Existed_Raid", 00:15:13.849 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:13.849 "strip_size_kb": 64, 00:15:13.849 "state": "configuring", 00:15:13.849 "raid_level": "raid0", 00:15:13.849 "superblock": true, 00:15:13.849 "num_base_bdevs": 4, 00:15:13.849 "num_base_bdevs_discovered": 2, 00:15:13.849 "num_base_bdevs_operational": 4, 00:15:13.849 "base_bdevs_list": [ 00:15:13.849 { 00:15:13.849 "name": "BaseBdev1", 00:15:13.849 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:13.849 "is_configured": true, 00:15:13.849 "data_offset": 2048, 00:15:13.849 "data_size": 63488 00:15:13.849 }, 00:15:13.849 { 00:15:13.849 "name": null, 00:15:13.849 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:13.849 "is_configured": false, 00:15:13.849 "data_offset": 2048, 00:15:13.849 "data_size": 63488 00:15:13.849 }, 00:15:13.849 { 00:15:13.849 "name": null, 00:15:13.849 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:13.849 "is_configured": false, 00:15:13.849 "data_offset": 2048, 00:15:13.849 "data_size": 63488 00:15:13.849 }, 00:15:13.849 { 00:15:13.849 "name": "BaseBdev4", 00:15:13.849 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:13.849 "is_configured": true, 00:15:13.849 "data_offset": 2048, 00:15:13.849 "data_size": 63488 00:15:13.849 } 00:15:13.849 ] 00:15:13.849 }' 00:15:13.849 11:56:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.849 11:56:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:14.417 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.417 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:14.417 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:14.417 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:14.676 [2024-07-12 11:56:04.698454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.676 "name": "Existed_Raid", 00:15:14.676 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:14.676 "strip_size_kb": 64, 00:15:14.676 "state": "configuring", 00:15:14.676 "raid_level": "raid0", 00:15:14.676 "superblock": true, 00:15:14.676 "num_base_bdevs": 4, 00:15:14.676 "num_base_bdevs_discovered": 3, 00:15:14.676 "num_base_bdevs_operational": 4, 00:15:14.676 "base_bdevs_list": [ 00:15:14.676 { 00:15:14.676 "name": "BaseBdev1", 00:15:14.676 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:14.676 "is_configured": true, 00:15:14.676 "data_offset": 2048, 00:15:14.676 "data_size": 63488 00:15:14.676 }, 00:15:14.676 { 00:15:14.676 "name": null, 00:15:14.676 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:14.676 "is_configured": false, 00:15:14.676 "data_offset": 2048, 00:15:14.676 "data_size": 63488 00:15:14.676 }, 00:15:14.676 { 00:15:14.676 "name": "BaseBdev3", 00:15:14.676 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:14.676 "is_configured": true, 00:15:14.676 "data_offset": 2048, 00:15:14.676 "data_size": 63488 00:15:14.676 }, 00:15:14.676 { 00:15:14.676 "name": "BaseBdev4", 00:15:14.676 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:14.676 "is_configured": true, 00:15:14.676 "data_offset": 2048, 00:15:14.676 "data_size": 63488 00:15:14.676 } 00:15:14.676 ] 00:15:14.676 }' 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.676 11:56:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:15.243 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.243 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:15.502 [2024-07-12 11:56:05.681022] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.502 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:15.761 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.761 "name": "Existed_Raid", 00:15:15.761 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:15.761 "strip_size_kb": 64, 00:15:15.761 "state": "configuring", 00:15:15.761 "raid_level": "raid0", 00:15:15.761 "superblock": true, 00:15:15.761 "num_base_bdevs": 4, 00:15:15.761 "num_base_bdevs_discovered": 2, 00:15:15.761 "num_base_bdevs_operational": 4, 00:15:15.761 "base_bdevs_list": [ 00:15:15.761 { 00:15:15.761 "name": null, 00:15:15.761 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:15.761 "is_configured": false, 00:15:15.761 "data_offset": 2048, 00:15:15.761 "data_size": 63488 00:15:15.761 }, 00:15:15.761 { 00:15:15.761 "name": null, 00:15:15.761 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:15.761 "is_configured": false, 00:15:15.761 "data_offset": 2048, 00:15:15.761 "data_size": 63488 00:15:15.761 }, 00:15:15.761 { 00:15:15.761 "name": "BaseBdev3", 00:15:15.761 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:15.761 "is_configured": true, 00:15:15.761 "data_offset": 2048, 00:15:15.761 "data_size": 63488 00:15:15.761 }, 00:15:15.761 { 00:15:15.761 "name": "BaseBdev4", 00:15:15.761 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:15.761 "is_configured": true, 00:15:15.761 "data_offset": 2048, 00:15:15.761 "data_size": 63488 00:15:15.761 } 00:15:15.761 ] 00:15:15.761 }' 00:15:15.761 11:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:15:15.761 11:56:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:16.328 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.328 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:16.328 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:16.328 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:16.587 [2024-07-12 11:56:06.641195] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.587 "name": "Existed_Raid", 00:15:16.587 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:16.587 "strip_size_kb": 64, 00:15:16.587 "state": "configuring", 00:15:16.587 "raid_level": "raid0", 00:15:16.587 "superblock": true, 00:15:16.587 "num_base_bdevs": 4, 00:15:16.587 "num_base_bdevs_discovered": 3, 00:15:16.587 "num_base_bdevs_operational": 4, 00:15:16.587 "base_bdevs_list": [ 00:15:16.587 { 00:15:16.587 "name": null, 00:15:16.587 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:16.587 "is_configured": false, 00:15:16.587 "data_offset": 2048, 00:15:16.587 "data_size": 63488 00:15:16.587 }, 00:15:16.587 { 00:15:16.587 "name": "BaseBdev2", 00:15:16.587 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:16.587 "is_configured": true, 00:15:16.587 "data_offset": 2048, 00:15:16.587 "data_size": 63488 00:15:16.587 }, 00:15:16.587 { 00:15:16.587 "name": "BaseBdev3", 00:15:16.587 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:16.587 "is_configured": true, 00:15:16.587 "data_offset": 2048, 00:15:16.587 "data_size": 63488 00:15:16.587 }, 00:15:16.587 { 00:15:16.587 "name": "BaseBdev4", 00:15:16.587 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:16.587 "is_configured": true, 00:15:16.587 "data_offset": 2048, 00:15:16.587 "data_size": 63488 00:15:16.587 } 00:15:16.587 ] 00:15:16.587 }' 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:15:16.587 11:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:17.154 11:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:17.154 11:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.412 11:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:17.412 11:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.412 11:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:17.412 11:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u edf09e32-8eb3-4193-922a-9d49af7ffa9d 00:15:17.671 [2024-07-12 11:56:07.754788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:17.671 [2024-07-12 11:56:07.754910] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a79050 00:15:17.671 [2024-07-12 11:56:07.754918] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:17.671 [2024-07-12 11:56:07.755039] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18d3710 00:15:17.671 [2024-07-12 11:56:07.755119] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a79050 00:15:17.671 [2024-07-12 11:56:07.755124] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a79050 00:15:17.671 [2024-07-12 11:56:07.755187] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:17.671 NewBaseBdev 00:15:17.671 11:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:17.671 11:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:17.671 11:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:17.671 11:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:17.671 11:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:17.671 11:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:17.671 11:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:17.930 11:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:17.930 [ 00:15:17.930 { 00:15:17.930 "name": "NewBaseBdev", 00:15:17.930 "aliases": [ 00:15:17.930 "edf09e32-8eb3-4193-922a-9d49af7ffa9d" 00:15:17.930 ], 00:15:17.930 "product_name": "Malloc disk", 00:15:17.930 "block_size": 512, 00:15:17.930 "num_blocks": 65536, 00:15:17.930 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:17.930 "assigned_rate_limits": { 00:15:17.930 "rw_ios_per_sec": 0, 00:15:17.930 "rw_mbytes_per_sec": 0, 00:15:17.930 "r_mbytes_per_sec": 0, 00:15:17.930 "w_mbytes_per_sec": 0 00:15:17.930 }, 00:15:17.930 "claimed": true, 00:15:17.930 "claim_type": "exclusive_write", 00:15:17.930 "zoned": false, 00:15:17.930 "supported_io_types": { 00:15:17.930 "read": true, 00:15:17.930 "write": true, 00:15:17.930 "unmap": true, 00:15:17.930 "flush": true, 
00:15:17.930 "reset": true, 00:15:17.930 "nvme_admin": false, 00:15:17.930 "nvme_io": false, 00:15:17.930 "nvme_io_md": false, 00:15:17.930 "write_zeroes": true, 00:15:17.930 "zcopy": true, 00:15:17.930 "get_zone_info": false, 00:15:17.930 "zone_management": false, 00:15:17.930 "zone_append": false, 00:15:17.930 "compare": false, 00:15:17.930 "compare_and_write": false, 00:15:17.930 "abort": true, 00:15:17.930 "seek_hole": false, 00:15:17.930 "seek_data": false, 00:15:17.930 "copy": true, 00:15:17.930 "nvme_iov_md": false 00:15:17.930 }, 00:15:17.930 "memory_domains": [ 00:15:17.930 { 00:15:17.930 "dma_device_id": "system", 00:15:17.930 "dma_device_type": 1 00:15:17.930 }, 00:15:17.930 { 00:15:17.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.930 "dma_device_type": 2 00:15:17.930 } 00:15:17.930 ], 00:15:17.930 "driver_specific": {} 00:15:17.930 } 00:15:17.930 ] 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.930 11:56:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.930 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.189 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.189 "name": "Existed_Raid", 00:15:18.189 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:18.189 "strip_size_kb": 64, 00:15:18.189 "state": "online", 00:15:18.189 "raid_level": "raid0", 00:15:18.189 "superblock": true, 00:15:18.189 "num_base_bdevs": 4, 00:15:18.189 "num_base_bdevs_discovered": 4, 00:15:18.189 "num_base_bdevs_operational": 4, 00:15:18.189 "base_bdevs_list": [ 00:15:18.189 { 00:15:18.189 "name": "NewBaseBdev", 00:15:18.189 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:18.189 "is_configured": true, 00:15:18.189 "data_offset": 2048, 00:15:18.189 "data_size": 63488 00:15:18.189 }, 00:15:18.189 { 00:15:18.189 "name": "BaseBdev2", 00:15:18.189 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:18.189 "is_configured": true, 00:15:18.189 "data_offset": 2048, 00:15:18.189 "data_size": 63488 00:15:18.189 }, 00:15:18.189 { 00:15:18.189 "name": "BaseBdev3", 00:15:18.189 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:18.189 "is_configured": true, 00:15:18.189 "data_offset": 2048, 00:15:18.189 "data_size": 63488 00:15:18.189 }, 00:15:18.189 { 00:15:18.189 "name": "BaseBdev4", 00:15:18.189 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:18.189 "is_configured": true, 00:15:18.189 "data_offset": 2048, 00:15:18.189 "data_size": 63488 00:15:18.189 } 00:15:18.189 ] 00:15:18.189 }' 00:15:18.189 
11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.189 11:56:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:18.778 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:18.778 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:18.778 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:18.778 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:18.778 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:18.778 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:18.778 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:18.778 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:18.778 [2024-07-12 11:56:08.881916] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:18.778 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:18.778 "name": "Existed_Raid", 00:15:18.778 "aliases": [ 00:15:18.778 "fd07743a-6a49-4ef0-897d-33d745e66ec2" 00:15:18.778 ], 00:15:18.778 "product_name": "Raid Volume", 00:15:18.778 "block_size": 512, 00:15:18.778 "num_blocks": 253952, 00:15:18.778 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:18.778 "assigned_rate_limits": { 00:15:18.778 "rw_ios_per_sec": 0, 00:15:18.778 "rw_mbytes_per_sec": 0, 00:15:18.778 "r_mbytes_per_sec": 0, 00:15:18.778 "w_mbytes_per_sec": 0 00:15:18.778 }, 00:15:18.778 "claimed": false, 00:15:18.778 "zoned": false, 00:15:18.778 
"supported_io_types": { 00:15:18.778 "read": true, 00:15:18.778 "write": true, 00:15:18.778 "unmap": true, 00:15:18.778 "flush": true, 00:15:18.778 "reset": true, 00:15:18.778 "nvme_admin": false, 00:15:18.778 "nvme_io": false, 00:15:18.778 "nvme_io_md": false, 00:15:18.778 "write_zeroes": true, 00:15:18.778 "zcopy": false, 00:15:18.778 "get_zone_info": false, 00:15:18.778 "zone_management": false, 00:15:18.778 "zone_append": false, 00:15:18.778 "compare": false, 00:15:18.778 "compare_and_write": false, 00:15:18.778 "abort": false, 00:15:18.778 "seek_hole": false, 00:15:18.778 "seek_data": false, 00:15:18.778 "copy": false, 00:15:18.778 "nvme_iov_md": false 00:15:18.778 }, 00:15:18.778 "memory_domains": [ 00:15:18.778 { 00:15:18.778 "dma_device_id": "system", 00:15:18.778 "dma_device_type": 1 00:15:18.778 }, 00:15:18.778 { 00:15:18.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.778 "dma_device_type": 2 00:15:18.778 }, 00:15:18.778 { 00:15:18.778 "dma_device_id": "system", 00:15:18.778 "dma_device_type": 1 00:15:18.778 }, 00:15:18.778 { 00:15:18.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.778 "dma_device_type": 2 00:15:18.778 }, 00:15:18.778 { 00:15:18.778 "dma_device_id": "system", 00:15:18.778 "dma_device_type": 1 00:15:18.778 }, 00:15:18.778 { 00:15:18.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.778 "dma_device_type": 2 00:15:18.778 }, 00:15:18.778 { 00:15:18.778 "dma_device_id": "system", 00:15:18.778 "dma_device_type": 1 00:15:18.778 }, 00:15:18.778 { 00:15:18.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.778 "dma_device_type": 2 00:15:18.778 } 00:15:18.778 ], 00:15:18.778 "driver_specific": { 00:15:18.778 "raid": { 00:15:18.778 "uuid": "fd07743a-6a49-4ef0-897d-33d745e66ec2", 00:15:18.778 "strip_size_kb": 64, 00:15:18.778 "state": "online", 00:15:18.778 "raid_level": "raid0", 00:15:18.778 "superblock": true, 00:15:18.778 "num_base_bdevs": 4, 00:15:18.778 "num_base_bdevs_discovered": 4, 00:15:18.778 
"num_base_bdevs_operational": 4, 00:15:18.778 "base_bdevs_list": [ 00:15:18.778 { 00:15:18.778 "name": "NewBaseBdev", 00:15:18.778 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:18.778 "is_configured": true, 00:15:18.778 "data_offset": 2048, 00:15:18.778 "data_size": 63488 00:15:18.778 }, 00:15:18.778 { 00:15:18.778 "name": "BaseBdev2", 00:15:18.778 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:18.778 "is_configured": true, 00:15:18.778 "data_offset": 2048, 00:15:18.778 "data_size": 63488 00:15:18.778 }, 00:15:18.778 { 00:15:18.778 "name": "BaseBdev3", 00:15:18.778 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:18.778 "is_configured": true, 00:15:18.778 "data_offset": 2048, 00:15:18.778 "data_size": 63488 00:15:18.778 }, 00:15:18.778 { 00:15:18.779 "name": "BaseBdev4", 00:15:18.779 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:18.779 "is_configured": true, 00:15:18.779 "data_offset": 2048, 00:15:18.779 "data_size": 63488 00:15:18.779 } 00:15:18.779 ] 00:15:18.779 } 00:15:18.779 } 00:15:18.779 }' 00:15:18.779 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:18.779 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:18.779 BaseBdev2 00:15:18.779 BaseBdev3 00:15:18.779 BaseBdev4' 00:15:18.779 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:18.779 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:18.779 11:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:19.038 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:19.038 "name": "NewBaseBdev", 00:15:19.038 
"aliases": [ 00:15:19.038 "edf09e32-8eb3-4193-922a-9d49af7ffa9d" 00:15:19.038 ], 00:15:19.038 "product_name": "Malloc disk", 00:15:19.038 "block_size": 512, 00:15:19.038 "num_blocks": 65536, 00:15:19.038 "uuid": "edf09e32-8eb3-4193-922a-9d49af7ffa9d", 00:15:19.038 "assigned_rate_limits": { 00:15:19.038 "rw_ios_per_sec": 0, 00:15:19.038 "rw_mbytes_per_sec": 0, 00:15:19.038 "r_mbytes_per_sec": 0, 00:15:19.038 "w_mbytes_per_sec": 0 00:15:19.038 }, 00:15:19.038 "claimed": true, 00:15:19.038 "claim_type": "exclusive_write", 00:15:19.038 "zoned": false, 00:15:19.038 "supported_io_types": { 00:15:19.038 "read": true, 00:15:19.038 "write": true, 00:15:19.038 "unmap": true, 00:15:19.038 "flush": true, 00:15:19.038 "reset": true, 00:15:19.038 "nvme_admin": false, 00:15:19.038 "nvme_io": false, 00:15:19.038 "nvme_io_md": false, 00:15:19.038 "write_zeroes": true, 00:15:19.038 "zcopy": true, 00:15:19.038 "get_zone_info": false, 00:15:19.038 "zone_management": false, 00:15:19.038 "zone_append": false, 00:15:19.038 "compare": false, 00:15:19.038 "compare_and_write": false, 00:15:19.038 "abort": true, 00:15:19.038 "seek_hole": false, 00:15:19.038 "seek_data": false, 00:15:19.038 "copy": true, 00:15:19.038 "nvme_iov_md": false 00:15:19.038 }, 00:15:19.038 "memory_domains": [ 00:15:19.038 { 00:15:19.038 "dma_device_id": "system", 00:15:19.038 "dma_device_type": 1 00:15:19.038 }, 00:15:19.038 { 00:15:19.038 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.038 "dma_device_type": 2 00:15:19.038 } 00:15:19.038 ], 00:15:19.038 "driver_specific": {} 00:15:19.038 }' 00:15:19.038 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.038 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.038 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:19.038 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:15:19.038 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.038 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:19.038 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:19.297 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:19.297 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:19.297 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.297 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.297 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:19.297 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:19.297 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:19.297 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:19.556 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:19.556 "name": "BaseBdev2", 00:15:19.556 "aliases": [ 00:15:19.556 "58634589-7202-4567-8ca0-c1cd1f492bfb" 00:15:19.556 ], 00:15:19.556 "product_name": "Malloc disk", 00:15:19.556 "block_size": 512, 00:15:19.556 "num_blocks": 65536, 00:15:19.556 "uuid": "58634589-7202-4567-8ca0-c1cd1f492bfb", 00:15:19.556 "assigned_rate_limits": { 00:15:19.556 "rw_ios_per_sec": 0, 00:15:19.556 "rw_mbytes_per_sec": 0, 00:15:19.556 "r_mbytes_per_sec": 0, 00:15:19.556 "w_mbytes_per_sec": 0 00:15:19.556 }, 00:15:19.556 "claimed": true, 00:15:19.556 "claim_type": "exclusive_write", 00:15:19.556 "zoned": false, 00:15:19.556 
"supported_io_types": { 00:15:19.556 "read": true, 00:15:19.556 "write": true, 00:15:19.556 "unmap": true, 00:15:19.556 "flush": true, 00:15:19.556 "reset": true, 00:15:19.556 "nvme_admin": false, 00:15:19.556 "nvme_io": false, 00:15:19.556 "nvme_io_md": false, 00:15:19.556 "write_zeroes": true, 00:15:19.556 "zcopy": true, 00:15:19.556 "get_zone_info": false, 00:15:19.556 "zone_management": false, 00:15:19.556 "zone_append": false, 00:15:19.556 "compare": false, 00:15:19.556 "compare_and_write": false, 00:15:19.556 "abort": true, 00:15:19.556 "seek_hole": false, 00:15:19.556 "seek_data": false, 00:15:19.556 "copy": true, 00:15:19.556 "nvme_iov_md": false 00:15:19.556 }, 00:15:19.556 "memory_domains": [ 00:15:19.556 { 00:15:19.556 "dma_device_id": "system", 00:15:19.556 "dma_device_type": 1 00:15:19.556 }, 00:15:19.556 { 00:15:19.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.556 "dma_device_type": 2 00:15:19.556 } 00:15:19.556 ], 00:15:19.556 "driver_specific": {} 00:15:19.556 }' 00:15:19.556 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.556 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.556 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:19.556 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.556 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.556 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:19.556 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:19.556 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:19.815 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:19.815 11:56:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.815 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.815 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:19.815 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:19.815 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:19.815 11:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:20.074 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:20.074 "name": "BaseBdev3", 00:15:20.074 "aliases": [ 00:15:20.074 "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d" 00:15:20.074 ], 00:15:20.074 "product_name": "Malloc disk", 00:15:20.074 "block_size": 512, 00:15:20.074 "num_blocks": 65536, 00:15:20.074 "uuid": "cc3d0745-c1d8-4afe-a31c-dfd60a8f978d", 00:15:20.074 "assigned_rate_limits": { 00:15:20.074 "rw_ios_per_sec": 0, 00:15:20.074 "rw_mbytes_per_sec": 0, 00:15:20.074 "r_mbytes_per_sec": 0, 00:15:20.074 "w_mbytes_per_sec": 0 00:15:20.074 }, 00:15:20.074 "claimed": true, 00:15:20.074 "claim_type": "exclusive_write", 00:15:20.074 "zoned": false, 00:15:20.074 "supported_io_types": { 00:15:20.074 "read": true, 00:15:20.074 "write": true, 00:15:20.074 "unmap": true, 00:15:20.074 "flush": true, 00:15:20.074 "reset": true, 00:15:20.074 "nvme_admin": false, 00:15:20.074 "nvme_io": false, 00:15:20.074 "nvme_io_md": false, 00:15:20.074 "write_zeroes": true, 00:15:20.074 "zcopy": true, 00:15:20.074 "get_zone_info": false, 00:15:20.074 "zone_management": false, 00:15:20.074 "zone_append": false, 00:15:20.074 "compare": false, 00:15:20.074 "compare_and_write": false, 00:15:20.074 "abort": true, 00:15:20.074 
"seek_hole": false, 00:15:20.074 "seek_data": false, 00:15:20.074 "copy": true, 00:15:20.074 "nvme_iov_md": false 00:15:20.074 }, 00:15:20.075 "memory_domains": [ 00:15:20.075 { 00:15:20.075 "dma_device_id": "system", 00:15:20.075 "dma_device_type": 1 00:15:20.075 }, 00:15:20.075 { 00:15:20.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.075 "dma_device_type": 2 00:15:20.075 } 00:15:20.075 ], 00:15:20.075 "driver_specific": {} 00:15:20.075 }' 00:15:20.075 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.075 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.075 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:20.075 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.075 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.075 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:20.075 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.075 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.334 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:20.334 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.334 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.334 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:20.334 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:20.334 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:20.334 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:20.334 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:20.334 "name": "BaseBdev4", 00:15:20.334 "aliases": [ 00:15:20.334 "6e869165-0c62-4973-94e7-2bd222d1bc68" 00:15:20.334 ], 00:15:20.334 "product_name": "Malloc disk", 00:15:20.334 "block_size": 512, 00:15:20.334 "num_blocks": 65536, 00:15:20.334 "uuid": "6e869165-0c62-4973-94e7-2bd222d1bc68", 00:15:20.334 "assigned_rate_limits": { 00:15:20.334 "rw_ios_per_sec": 0, 00:15:20.334 "rw_mbytes_per_sec": 0, 00:15:20.334 "r_mbytes_per_sec": 0, 00:15:20.334 "w_mbytes_per_sec": 0 00:15:20.334 }, 00:15:20.334 "claimed": true, 00:15:20.334 "claim_type": "exclusive_write", 00:15:20.334 "zoned": false, 00:15:20.334 "supported_io_types": { 00:15:20.334 "read": true, 00:15:20.334 "write": true, 00:15:20.334 "unmap": true, 00:15:20.334 "flush": true, 00:15:20.334 "reset": true, 00:15:20.334 "nvme_admin": false, 00:15:20.334 "nvme_io": false, 00:15:20.334 "nvme_io_md": false, 00:15:20.334 "write_zeroes": true, 00:15:20.334 "zcopy": true, 00:15:20.334 "get_zone_info": false, 00:15:20.334 "zone_management": false, 00:15:20.334 "zone_append": false, 00:15:20.334 "compare": false, 00:15:20.334 "compare_and_write": false, 00:15:20.334 "abort": true, 00:15:20.334 "seek_hole": false, 00:15:20.334 "seek_data": false, 00:15:20.334 "copy": true, 00:15:20.334 "nvme_iov_md": false 00:15:20.334 }, 00:15:20.334 "memory_domains": [ 00:15:20.334 { 00:15:20.334 "dma_device_id": "system", 00:15:20.334 "dma_device_type": 1 00:15:20.334 }, 00:15:20.334 { 00:15:20.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.334 "dma_device_type": 2 00:15:20.334 } 00:15:20.334 ], 00:15:20.334 "driver_specific": {} 00:15:20.334 }' 00:15:20.334 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.594 
11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.594 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:20.594 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.594 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.594 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:20.594 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.594 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.594 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:20.594 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.853 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.853 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:20.853 11:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:20.853 [2024-07-12 11:56:11.023263] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:20.853 [2024-07-12 11:56:11.023283] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:20.853 [2024-07-12 11:56:11.023321] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:20.853 [2024-07-12 11:56:11.023363] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:20.853 [2024-07-12 11:56:11.023369] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0x1a79050 name Existed_Raid, state offline 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 653885 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 653885 ']' 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 653885 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 653885 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 653885' 00:15:20.853 killing process with pid 653885 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 653885 00:15:20.853 [2024-07-12 11:56:11.080615] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:20.853 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 653885 00:15:21.112 [2024-07-12 11:56:11.111927] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:21.112 11:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:21.112 00:15:21.112 real 0m24.141s 00:15:21.112 user 0m45.011s 00:15:21.112 sys 0m3.695s 00:15:21.112 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:21.112 11:56:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set 
+x 00:15:21.112 ************************************ 00:15:21.112 END TEST raid_state_function_test_sb 00:15:21.112 ************************************ 00:15:21.112 11:56:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:21.112 11:56:11 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:15:21.112 11:56:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:21.112 11:56:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:21.112 11:56:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:21.371 ************************************ 00:15:21.371 START TEST raid_superblock_test 00:15:21.371 ************************************ 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=658651 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 658651 /var/tmp/spdk-raid.sock 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 658651 ']' 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:21.371 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:21.371 11:56:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:21.371 [2024-07-12 11:56:11.413712] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:15:21.371 [2024-07-12 11:56:11.413753] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid658651 ] 00:15:21.371 [2024-07-12 11:56:11.478436] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:21.371 [2024-07-12 11:56:11.548110] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:21.371 [2024-07-12 11:56:11.600353] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:21.371 [2024-07-12 11:56:11.600381] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:22.307 11:56:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:22.307 11:56:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:22.307 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:22.307 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:15:22.308 malloc1 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:22.308 [2024-07-12 11:56:12.521048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:22.308 [2024-07-12 11:56:12.521084] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:22.308 [2024-07-12 11:56:12.521094] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2147270 00:15:22.308 [2024-07-12 11:56:12.521100] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:22.308 [2024-07-12 11:56:12.522170] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:22.308 [2024-07-12 11:56:12.522190] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:22.308 pt1 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:22.308 11:56:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:22.566 malloc2 00:15:22.566 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:22.825 [2024-07-12 11:56:12.857332] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:22.825 [2024-07-12 11:56:12.857360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:22.825 [2024-07-12 11:56:12.857369] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2148580 00:15:22.825 [2024-07-12 11:56:12.857375] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:22.825 [2024-07-12 11:56:12.858348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:22.825 [2024-07-12 11:56:12.858367] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:22.825 pt2 00:15:22.825 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:22.825 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:22.825 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:22.825 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:22.825 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:22.825 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:22.825 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:22.825 11:56:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:22.825 11:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:22.825 malloc3 00:15:22.825 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:23.083 [2024-07-12 11:56:13.177462] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:23.083 [2024-07-12 11:56:13.177489] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:23.083 [2024-07-12 11:56:13.177497] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f2e30 00:15:23.083 [2024-07-12 11:56:13.177503] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:23.083 [2024-07-12 11:56:13.178443] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:23.083 [2024-07-12 11:56:13.178463] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:23.083 pt3 00:15:23.083 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:23.083 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:23.083 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:15:23.083 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:15:23.083 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:15:23.083 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:23.083 11:56:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:23.083 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:23.083 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:15:23.341 malloc4 00:15:23.341 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:15:23.341 [2024-07-12 11:56:13.521658] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:15:23.341 [2024-07-12 11:56:13.521690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:23.341 [2024-07-12 11:56:13.521700] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f5570 00:15:23.341 [2024-07-12 11:56:13.521706] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:23.341 [2024-07-12 11:56:13.522692] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:23.341 [2024-07-12 11:56:13.522711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:15:23.341 pt4 00:15:23.341 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:23.341 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:23.341 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:15:23.599 [2024-07-12 11:56:13.702145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:15:23.599 [2024-07-12 11:56:13.703038] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:23.599 [2024-07-12 11:56:13.703077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:23.599 [2024-07-12 11:56:13.703109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:15:23.599 [2024-07-12 11:56:13.703228] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22f6a80 00:15:23.599 [2024-07-12 11:56:13.703235] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:23.599 [2024-07-12 11:56:13.703371] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2146430 00:15:23.599 [2024-07-12 11:56:13.703472] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22f6a80 00:15:23.599 [2024-07-12 11:56:13.703478] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22f6a80 00:15:23.599 [2024-07-12 11:56:13.703549] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.599 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:23.858 11:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.858 "name": "raid_bdev1", 00:15:23.858 "uuid": "1da60072-1d77-487e-9d33-04d2d421c831", 00:15:23.858 "strip_size_kb": 64, 00:15:23.858 "state": "online", 00:15:23.858 "raid_level": "raid0", 00:15:23.858 "superblock": true, 00:15:23.858 "num_base_bdevs": 4, 00:15:23.858 "num_base_bdevs_discovered": 4, 00:15:23.858 "num_base_bdevs_operational": 4, 00:15:23.858 "base_bdevs_list": [ 00:15:23.858 { 00:15:23.858 "name": "pt1", 00:15:23.858 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:23.858 "is_configured": true, 00:15:23.858 "data_offset": 2048, 00:15:23.858 "data_size": 63488 00:15:23.858 }, 00:15:23.858 { 00:15:23.858 "name": "pt2", 00:15:23.858 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:23.858 "is_configured": true, 00:15:23.858 "data_offset": 2048, 00:15:23.858 "data_size": 63488 00:15:23.858 }, 00:15:23.858 { 00:15:23.858 "name": "pt3", 00:15:23.858 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:23.858 "is_configured": true, 00:15:23.858 "data_offset": 2048, 00:15:23.858 "data_size": 63488 00:15:23.858 }, 00:15:23.858 { 00:15:23.858 "name": "pt4", 00:15:23.858 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:23.858 "is_configured": true, 00:15:23.858 "data_offset": 2048, 00:15:23.858 "data_size": 63488 00:15:23.858 } 00:15:23.858 ] 00:15:23.858 }' 00:15:23.858 11:56:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.858 11:56:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:24.479 [2024-07-12 11:56:14.548522] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:24.479 "name": "raid_bdev1", 00:15:24.479 "aliases": [ 00:15:24.479 "1da60072-1d77-487e-9d33-04d2d421c831" 00:15:24.479 ], 00:15:24.479 "product_name": "Raid Volume", 00:15:24.479 "block_size": 512, 00:15:24.479 "num_blocks": 253952, 00:15:24.479 "uuid": "1da60072-1d77-487e-9d33-04d2d421c831", 00:15:24.479 "assigned_rate_limits": { 00:15:24.479 "rw_ios_per_sec": 0, 00:15:24.479 "rw_mbytes_per_sec": 0, 00:15:24.479 "r_mbytes_per_sec": 0, 00:15:24.479 "w_mbytes_per_sec": 0 00:15:24.479 }, 00:15:24.479 "claimed": false, 00:15:24.479 "zoned": false, 00:15:24.479 "supported_io_types": { 00:15:24.479 "read": true, 00:15:24.479 "write": true, 00:15:24.479 
"unmap": true, 00:15:24.479 "flush": true, 00:15:24.479 "reset": true, 00:15:24.479 "nvme_admin": false, 00:15:24.479 "nvme_io": false, 00:15:24.479 "nvme_io_md": false, 00:15:24.479 "write_zeroes": true, 00:15:24.479 "zcopy": false, 00:15:24.479 "get_zone_info": false, 00:15:24.479 "zone_management": false, 00:15:24.479 "zone_append": false, 00:15:24.479 "compare": false, 00:15:24.479 "compare_and_write": false, 00:15:24.479 "abort": false, 00:15:24.479 "seek_hole": false, 00:15:24.479 "seek_data": false, 00:15:24.479 "copy": false, 00:15:24.479 "nvme_iov_md": false 00:15:24.479 }, 00:15:24.479 "memory_domains": [ 00:15:24.479 { 00:15:24.479 "dma_device_id": "system", 00:15:24.479 "dma_device_type": 1 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.479 "dma_device_type": 2 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "dma_device_id": "system", 00:15:24.479 "dma_device_type": 1 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.479 "dma_device_type": 2 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "dma_device_id": "system", 00:15:24.479 "dma_device_type": 1 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.479 "dma_device_type": 2 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "dma_device_id": "system", 00:15:24.479 "dma_device_type": 1 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.479 "dma_device_type": 2 00:15:24.479 } 00:15:24.479 ], 00:15:24.479 "driver_specific": { 00:15:24.479 "raid": { 00:15:24.479 "uuid": "1da60072-1d77-487e-9d33-04d2d421c831", 00:15:24.479 "strip_size_kb": 64, 00:15:24.479 "state": "online", 00:15:24.479 "raid_level": "raid0", 00:15:24.479 "superblock": true, 00:15:24.479 "num_base_bdevs": 4, 00:15:24.479 "num_base_bdevs_discovered": 4, 00:15:24.479 "num_base_bdevs_operational": 4, 00:15:24.479 "base_bdevs_list": [ 00:15:24.479 { 00:15:24.479 "name": "pt1", 
00:15:24.479 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:24.479 "is_configured": true, 00:15:24.479 "data_offset": 2048, 00:15:24.479 "data_size": 63488 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "name": "pt2", 00:15:24.479 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:24.479 "is_configured": true, 00:15:24.479 "data_offset": 2048, 00:15:24.479 "data_size": 63488 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "name": "pt3", 00:15:24.479 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:24.479 "is_configured": true, 00:15:24.479 "data_offset": 2048, 00:15:24.479 "data_size": 63488 00:15:24.479 }, 00:15:24.479 { 00:15:24.479 "name": "pt4", 00:15:24.479 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:24.479 "is_configured": true, 00:15:24.479 "data_offset": 2048, 00:15:24.479 "data_size": 63488 00:15:24.479 } 00:15:24.479 ] 00:15:24.479 } 00:15:24.479 } 00:15:24.479 }' 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:24.479 pt2 00:15:24.479 pt3 00:15:24.479 pt4' 00:15:24.479 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:24.480 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:24.480 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:24.738 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:24.738 "name": "pt1", 00:15:24.738 "aliases": [ 00:15:24.738 "00000000-0000-0000-0000-000000000001" 00:15:24.738 ], 00:15:24.738 "product_name": "passthru", 00:15:24.738 "block_size": 512, 00:15:24.738 "num_blocks": 65536, 00:15:24.738 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:15:24.738 "assigned_rate_limits": { 00:15:24.738 "rw_ios_per_sec": 0, 00:15:24.738 "rw_mbytes_per_sec": 0, 00:15:24.738 "r_mbytes_per_sec": 0, 00:15:24.738 "w_mbytes_per_sec": 0 00:15:24.738 }, 00:15:24.738 "claimed": true, 00:15:24.738 "claim_type": "exclusive_write", 00:15:24.738 "zoned": false, 00:15:24.738 "supported_io_types": { 00:15:24.738 "read": true, 00:15:24.739 "write": true, 00:15:24.739 "unmap": true, 00:15:24.739 "flush": true, 00:15:24.739 "reset": true, 00:15:24.739 "nvme_admin": false, 00:15:24.739 "nvme_io": false, 00:15:24.739 "nvme_io_md": false, 00:15:24.739 "write_zeroes": true, 00:15:24.739 "zcopy": true, 00:15:24.739 "get_zone_info": false, 00:15:24.739 "zone_management": false, 00:15:24.739 "zone_append": false, 00:15:24.739 "compare": false, 00:15:24.739 "compare_and_write": false, 00:15:24.739 "abort": true, 00:15:24.739 "seek_hole": false, 00:15:24.739 "seek_data": false, 00:15:24.739 "copy": true, 00:15:24.739 "nvme_iov_md": false 00:15:24.739 }, 00:15:24.739 "memory_domains": [ 00:15:24.739 { 00:15:24.739 "dma_device_id": "system", 00:15:24.739 "dma_device_type": 1 00:15:24.739 }, 00:15:24.739 { 00:15:24.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.739 "dma_device_type": 2 00:15:24.739 } 00:15:24.739 ], 00:15:24.739 "driver_specific": { 00:15:24.739 "passthru": { 00:15:24.739 "name": "pt1", 00:15:24.739 "base_bdev_name": "malloc1" 00:15:24.739 } 00:15:24.739 } 00:15:24.739 }' 00:15:24.739 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:24.739 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:24.739 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:24.739 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:24.739 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:24.739 11:56:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:24.739 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:24.739 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:24.739 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:24.739 11:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:24.997 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:24.997 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:24.997 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:24.997 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:24.997 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:24.997 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:24.997 "name": "pt2", 00:15:24.997 "aliases": [ 00:15:24.997 "00000000-0000-0000-0000-000000000002" 00:15:24.997 ], 00:15:24.997 "product_name": "passthru", 00:15:24.997 "block_size": 512, 00:15:24.997 "num_blocks": 65536, 00:15:24.997 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:24.997 "assigned_rate_limits": { 00:15:24.997 "rw_ios_per_sec": 0, 00:15:24.997 "rw_mbytes_per_sec": 0, 00:15:24.997 "r_mbytes_per_sec": 0, 00:15:24.997 "w_mbytes_per_sec": 0 00:15:24.997 }, 00:15:24.997 "claimed": true, 00:15:24.997 "claim_type": "exclusive_write", 00:15:24.997 "zoned": false, 00:15:24.997 "supported_io_types": { 00:15:24.997 "read": true, 00:15:24.997 "write": true, 00:15:24.997 "unmap": true, 00:15:24.997 "flush": true, 00:15:24.997 "reset": true, 00:15:24.997 "nvme_admin": false, 00:15:24.997 
"nvme_io": false, 00:15:24.997 "nvme_io_md": false, 00:15:24.997 "write_zeroes": true, 00:15:24.997 "zcopy": true, 00:15:24.997 "get_zone_info": false, 00:15:24.997 "zone_management": false, 00:15:24.997 "zone_append": false, 00:15:24.997 "compare": false, 00:15:24.997 "compare_and_write": false, 00:15:24.997 "abort": true, 00:15:24.997 "seek_hole": false, 00:15:24.997 "seek_data": false, 00:15:24.997 "copy": true, 00:15:24.997 "nvme_iov_md": false 00:15:24.997 }, 00:15:24.997 "memory_domains": [ 00:15:24.997 { 00:15:24.997 "dma_device_id": "system", 00:15:24.997 "dma_device_type": 1 00:15:24.997 }, 00:15:24.997 { 00:15:24.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.997 "dma_device_type": 2 00:15:24.997 } 00:15:24.997 ], 00:15:24.997 "driver_specific": { 00:15:24.997 "passthru": { 00:15:24.997 "name": "pt2", 00:15:24.998 "base_bdev_name": "malloc2" 00:15:24.998 } 00:15:24.998 } 00:15:24.998 }' 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:25.257 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.516 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:25.516 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:25.516 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:25.516 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:25.516 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:25.516 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:25.516 "name": "pt3", 00:15:25.516 "aliases": [ 00:15:25.516 "00000000-0000-0000-0000-000000000003" 00:15:25.516 ], 00:15:25.516 "product_name": "passthru", 00:15:25.516 "block_size": 512, 00:15:25.516 "num_blocks": 65536, 00:15:25.516 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:25.516 "assigned_rate_limits": { 00:15:25.516 "rw_ios_per_sec": 0, 00:15:25.516 "rw_mbytes_per_sec": 0, 00:15:25.516 "r_mbytes_per_sec": 0, 00:15:25.516 "w_mbytes_per_sec": 0 00:15:25.516 }, 00:15:25.516 "claimed": true, 00:15:25.516 "claim_type": "exclusive_write", 00:15:25.516 "zoned": false, 00:15:25.516 "supported_io_types": { 00:15:25.516 "read": true, 00:15:25.516 "write": true, 00:15:25.516 "unmap": true, 00:15:25.516 "flush": true, 00:15:25.516 "reset": true, 00:15:25.516 "nvme_admin": false, 00:15:25.516 "nvme_io": false, 00:15:25.516 "nvme_io_md": false, 00:15:25.516 "write_zeroes": true, 00:15:25.516 "zcopy": true, 00:15:25.516 "get_zone_info": false, 00:15:25.516 "zone_management": false, 00:15:25.516 "zone_append": false, 00:15:25.516 "compare": false, 00:15:25.516 "compare_and_write": false, 00:15:25.516 "abort": true, 00:15:25.516 "seek_hole": false, 00:15:25.516 "seek_data": false, 00:15:25.516 "copy": true, 00:15:25.516 "nvme_iov_md": false 00:15:25.516 }, 00:15:25.516 "memory_domains": [ 00:15:25.516 { 00:15:25.516 "dma_device_id": "system", 00:15:25.516 
"dma_device_type": 1 00:15:25.516 }, 00:15:25.516 { 00:15:25.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.516 "dma_device_type": 2 00:15:25.516 } 00:15:25.516 ], 00:15:25.516 "driver_specific": { 00:15:25.516 "passthru": { 00:15:25.516 "name": "pt3", 00:15:25.516 "base_bdev_name": "malloc3" 00:15:25.516 } 00:15:25.516 } 00:15:25.516 }' 00:15:25.516 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.775 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.775 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:25.775 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.775 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.775 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:25.775 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.775 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.775 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:25.775 11:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.775 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.034 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.034 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.034 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:15:26.034 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.034 11:56:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.034 "name": "pt4", 00:15:26.034 "aliases": [ 00:15:26.034 "00000000-0000-0000-0000-000000000004" 00:15:26.034 ], 00:15:26.034 "product_name": "passthru", 00:15:26.034 "block_size": 512, 00:15:26.034 "num_blocks": 65536, 00:15:26.034 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:26.034 "assigned_rate_limits": { 00:15:26.034 "rw_ios_per_sec": 0, 00:15:26.034 "rw_mbytes_per_sec": 0, 00:15:26.034 "r_mbytes_per_sec": 0, 00:15:26.034 "w_mbytes_per_sec": 0 00:15:26.034 }, 00:15:26.034 "claimed": true, 00:15:26.034 "claim_type": "exclusive_write", 00:15:26.034 "zoned": false, 00:15:26.034 "supported_io_types": { 00:15:26.034 "read": true, 00:15:26.034 "write": true, 00:15:26.034 "unmap": true, 00:15:26.034 "flush": true, 00:15:26.034 "reset": true, 00:15:26.034 "nvme_admin": false, 00:15:26.034 "nvme_io": false, 00:15:26.034 "nvme_io_md": false, 00:15:26.034 "write_zeroes": true, 00:15:26.034 "zcopy": true, 00:15:26.034 "get_zone_info": false, 00:15:26.034 "zone_management": false, 00:15:26.034 "zone_append": false, 00:15:26.034 "compare": false, 00:15:26.034 "compare_and_write": false, 00:15:26.034 "abort": true, 00:15:26.034 "seek_hole": false, 00:15:26.034 "seek_data": false, 00:15:26.034 "copy": true, 00:15:26.034 "nvme_iov_md": false 00:15:26.034 }, 00:15:26.034 "memory_domains": [ 00:15:26.034 { 00:15:26.034 "dma_device_id": "system", 00:15:26.034 "dma_device_type": 1 00:15:26.034 }, 00:15:26.034 { 00:15:26.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.034 "dma_device_type": 2 00:15:26.034 } 00:15:26.034 ], 00:15:26.034 "driver_specific": { 00:15:26.034 "passthru": { 00:15:26.034 "name": "pt4", 00:15:26.034 "base_bdev_name": "malloc4" 00:15:26.034 } 00:15:26.034 } 00:15:26.034 }' 00:15:26.034 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.034 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.293 11:56:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:26.293 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:26.552 [2024-07-12 11:56:16.678098] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:26.552 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1da60072-1d77-487e-9d33-04d2d421c831 00:15:26.552 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1da60072-1d77-487e-9d33-04d2d421c831 ']' 00:15:26.552 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:26.810 [2024-07-12 11:56:16.846318] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:26.810 
[2024-07-12 11:56:16.846331] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:26.810 [2024-07-12 11:56:16.846366] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:26.811 [2024-07-12 11:56:16.846410] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:26.811 [2024-07-12 11:56:16.846415] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22f6a80 name raid_bdev1, state offline 00:15:26.811 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.811 11:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:26.811 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:26.811 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:26.811 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:26.811 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:27.069 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:27.069 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:27.328 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:27.328 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:27.328 11:56:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:27.328 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:15:27.587 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:27.587 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:27.846 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:27.847 11:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:27.847 [2024-07-12 11:56:17.997274] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:27.847 [2024-07-12 11:56:17.998249] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:27.847 [2024-07-12 11:56:17.998281] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:27.847 [2024-07-12 11:56:17.998301] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:15:27.847 [2024-07-12 11:56:17.998332] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:27.847 [2024-07-12 11:56:17.998358] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:27.847 [2024-07-12 11:56:17.998370] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:27.847 [2024-07-12 11:56:17.998382] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:15:27.847 
[2024-07-12 11:56:17.998391] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:27.847 [2024-07-12 11:56:17.998398] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22f7dd0 name raid_bdev1, state configuring 00:15:27.847 request: 00:15:27.847 { 00:15:27.847 "name": "raid_bdev1", 00:15:27.847 "raid_level": "raid0", 00:15:27.847 "base_bdevs": [ 00:15:27.847 "malloc1", 00:15:27.847 "malloc2", 00:15:27.847 "malloc3", 00:15:27.847 "malloc4" 00:15:27.847 ], 00:15:27.847 "superblock": false, 00:15:27.847 "strip_size_kb": 64, 00:15:27.847 "method": "bdev_raid_create", 00:15:27.847 "req_id": 1 00:15:27.847 } 00:15:27.847 Got JSON-RPC error response 00:15:27.847 response: 00:15:27.847 { 00:15:27.847 "code": -17, 00:15:27.847 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:27.847 } 00:15:27.847 11:56:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:27.847 11:56:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:27.847 11:56:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:27.847 11:56:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:27.847 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.847 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:28.106 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:28.106 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:28.106 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:15:28.106 [2024-07-12 11:56:18.326081] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:28.106 [2024-07-12 11:56:18.326114] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:28.106 [2024-07-12 11:56:18.326124] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2147d90 00:15:28.106 [2024-07-12 11:56:18.326130] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:28.107 [2024-07-12 11:56:18.327335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:28.107 [2024-07-12 11:56:18.327357] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:28.107 [2024-07-12 11:56:18.327407] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:28.107 [2024-07-12 11:56:18.327425] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:28.107 pt1 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.107 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:28.366 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.366 "name": "raid_bdev1", 00:15:28.366 "uuid": "1da60072-1d77-487e-9d33-04d2d421c831", 00:15:28.366 "strip_size_kb": 64, 00:15:28.366 "state": "configuring", 00:15:28.366 "raid_level": "raid0", 00:15:28.366 "superblock": true, 00:15:28.366 "num_base_bdevs": 4, 00:15:28.366 "num_base_bdevs_discovered": 1, 00:15:28.366 "num_base_bdevs_operational": 4, 00:15:28.366 "base_bdevs_list": [ 00:15:28.366 { 00:15:28.366 "name": "pt1", 00:15:28.366 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:28.366 "is_configured": true, 00:15:28.366 "data_offset": 2048, 00:15:28.366 "data_size": 63488 00:15:28.366 }, 00:15:28.366 { 00:15:28.366 "name": null, 00:15:28.366 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:28.366 "is_configured": false, 00:15:28.366 "data_offset": 2048, 00:15:28.366 "data_size": 63488 00:15:28.366 }, 00:15:28.366 { 00:15:28.366 "name": null, 00:15:28.366 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:28.366 "is_configured": false, 00:15:28.366 "data_offset": 2048, 00:15:28.366 "data_size": 63488 00:15:28.366 }, 00:15:28.366 { 00:15:28.366 "name": null, 00:15:28.366 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:28.366 "is_configured": false, 00:15:28.366 "data_offset": 2048, 00:15:28.366 "data_size": 63488 00:15:28.366 } 00:15:28.366 ] 00:15:28.366 }' 00:15:28.366 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.366 11:56:18 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.934 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:15:28.934 11:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:28.934 [2024-07-12 11:56:19.136176] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:28.934 [2024-07-12 11:56:19.136217] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:28.934 [2024-07-12 11:56:19.136244] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f7090 00:15:28.934 [2024-07-12 11:56:19.136251] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:28.934 [2024-07-12 11:56:19.136513] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:28.934 [2024-07-12 11:56:19.136530] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:28.934 [2024-07-12 11:56:19.136576] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:28.934 [2024-07-12 11:56:19.136589] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:28.934 pt2 00:15:28.934 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:29.193 [2024-07-12 11:56:19.300621] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:29.193 11:56:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.193 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:29.452 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:29.452 "name": "raid_bdev1", 00:15:29.452 "uuid": "1da60072-1d77-487e-9d33-04d2d421c831", 00:15:29.452 "strip_size_kb": 64, 00:15:29.452 "state": "configuring", 00:15:29.452 "raid_level": "raid0", 00:15:29.452 "superblock": true, 00:15:29.452 "num_base_bdevs": 4, 00:15:29.452 "num_base_bdevs_discovered": 1, 00:15:29.452 "num_base_bdevs_operational": 4, 00:15:29.452 "base_bdevs_list": [ 00:15:29.452 { 00:15:29.452 "name": "pt1", 00:15:29.452 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:29.452 "is_configured": true, 00:15:29.452 "data_offset": 2048, 00:15:29.452 "data_size": 63488 00:15:29.452 }, 00:15:29.452 { 00:15:29.452 "name": null, 00:15:29.452 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:29.452 
"is_configured": false, 00:15:29.452 "data_offset": 2048, 00:15:29.452 "data_size": 63488 00:15:29.452 }, 00:15:29.452 { 00:15:29.452 "name": null, 00:15:29.452 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:29.452 "is_configured": false, 00:15:29.452 "data_offset": 2048, 00:15:29.452 "data_size": 63488 00:15:29.452 }, 00:15:29.452 { 00:15:29.452 "name": null, 00:15:29.452 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:29.452 "is_configured": false, 00:15:29.452 "data_offset": 2048, 00:15:29.452 "data_size": 63488 00:15:29.452 } 00:15:29.452 ] 00:15:29.452 }' 00:15:29.452 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.452 11:56:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.711 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:29.711 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:29.711 11:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:29.969 [2024-07-12 11:56:20.082626] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:29.969 [2024-07-12 11:56:20.082671] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:29.969 [2024-07-12 11:56:20.082699] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f8300 00:15:29.969 [2024-07-12 11:56:20.082706] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:29.969 [2024-07-12 11:56:20.082966] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:29.969 [2024-07-12 11:56:20.082975] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:29.969 [2024-07-12 11:56:20.083021] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:29.969 [2024-07-12 11:56:20.083034] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:29.969 pt2 00:15:29.969 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:29.969 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:29.970 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:30.229 [2024-07-12 11:56:20.255054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:30.229 [2024-07-12 11:56:20.255072] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:30.229 [2024-07-12 11:56:20.255081] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21474a0 00:15:30.229 [2024-07-12 11:56:20.255087] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:30.229 [2024-07-12 11:56:20.255290] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:30.229 [2024-07-12 11:56:20.255299] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:30.229 [2024-07-12 11:56:20.255331] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:30.229 [2024-07-12 11:56:20.255341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:30.229 pt3 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:15:30.229 [2024-07-12 11:56:20.431532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:15:30.229 [2024-07-12 11:56:20.431561] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:30.229 [2024-07-12 11:56:20.431570] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f1280 00:15:30.229 [2024-07-12 11:56:20.431577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:30.229 [2024-07-12 11:56:20.431798] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:30.229 [2024-07-12 11:56:20.431809] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:15:30.229 [2024-07-12 11:56:20.431843] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:15:30.229 [2024-07-12 11:56:20.431855] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:15:30.229 [2024-07-12 11:56:20.431943] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22f3ff0 00:15:30.229 [2024-07-12 11:56:20.431949] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:30.229 [2024-07-12 11:56:20.432076] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2147750 00:15:30.229 [2024-07-12 11:56:20.432174] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22f3ff0 00:15:30.229 [2024-07-12 11:56:20.432180] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22f3ff0 00:15:30.229 [2024-07-12 11:56:20.432250] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:30.229 pt4 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.229 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:30.488 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.488 "name": "raid_bdev1", 00:15:30.488 "uuid": "1da60072-1d77-487e-9d33-04d2d421c831", 00:15:30.488 "strip_size_kb": 64, 00:15:30.488 "state": "online", 00:15:30.488 "raid_level": "raid0", 00:15:30.488 "superblock": true, 00:15:30.488 "num_base_bdevs": 4, 00:15:30.488 "num_base_bdevs_discovered": 4, 00:15:30.488 "num_base_bdevs_operational": 4, 
00:15:30.488 "base_bdevs_list": [ 00:15:30.488 { 00:15:30.488 "name": "pt1", 00:15:30.488 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:30.488 "is_configured": true, 00:15:30.488 "data_offset": 2048, 00:15:30.488 "data_size": 63488 00:15:30.488 }, 00:15:30.488 { 00:15:30.488 "name": "pt2", 00:15:30.489 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:30.489 "is_configured": true, 00:15:30.489 "data_offset": 2048, 00:15:30.489 "data_size": 63488 00:15:30.489 }, 00:15:30.489 { 00:15:30.489 "name": "pt3", 00:15:30.489 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:30.489 "is_configured": true, 00:15:30.489 "data_offset": 2048, 00:15:30.489 "data_size": 63488 00:15:30.489 }, 00:15:30.489 { 00:15:30.489 "name": "pt4", 00:15:30.489 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:30.489 "is_configured": true, 00:15:30.489 "data_offset": 2048, 00:15:30.489 "data_size": 63488 00:15:30.489 } 00:15:30.489 ] 00:15:30.489 }' 00:15:30.489 11:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.489 11:56:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:31.056 [2024-07-12 11:56:21.253839] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:31.056 "name": "raid_bdev1", 00:15:31.056 "aliases": [ 00:15:31.056 "1da60072-1d77-487e-9d33-04d2d421c831" 00:15:31.056 ], 00:15:31.056 "product_name": "Raid Volume", 00:15:31.056 "block_size": 512, 00:15:31.056 "num_blocks": 253952, 00:15:31.056 "uuid": "1da60072-1d77-487e-9d33-04d2d421c831", 00:15:31.056 "assigned_rate_limits": { 00:15:31.056 "rw_ios_per_sec": 0, 00:15:31.056 "rw_mbytes_per_sec": 0, 00:15:31.056 "r_mbytes_per_sec": 0, 00:15:31.056 "w_mbytes_per_sec": 0 00:15:31.056 }, 00:15:31.056 "claimed": false, 00:15:31.056 "zoned": false, 00:15:31.056 "supported_io_types": { 00:15:31.056 "read": true, 00:15:31.056 "write": true, 00:15:31.056 "unmap": true, 00:15:31.056 "flush": true, 00:15:31.056 "reset": true, 00:15:31.056 "nvme_admin": false, 00:15:31.056 "nvme_io": false, 00:15:31.056 "nvme_io_md": false, 00:15:31.056 "write_zeroes": true, 00:15:31.056 "zcopy": false, 00:15:31.056 "get_zone_info": false, 00:15:31.056 "zone_management": false, 00:15:31.056 "zone_append": false, 00:15:31.056 "compare": false, 00:15:31.056 "compare_and_write": false, 00:15:31.056 "abort": false, 00:15:31.056 "seek_hole": false, 00:15:31.056 "seek_data": false, 00:15:31.056 "copy": false, 00:15:31.056 "nvme_iov_md": false 00:15:31.056 }, 00:15:31.056 "memory_domains": [ 00:15:31.056 { 00:15:31.056 "dma_device_id": "system", 00:15:31.056 "dma_device_type": 1 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.056 "dma_device_type": 2 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "dma_device_id": "system", 00:15:31.056 "dma_device_type": 1 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:15:31.056 "dma_device_type": 2 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "dma_device_id": "system", 00:15:31.056 "dma_device_type": 1 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.056 "dma_device_type": 2 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "dma_device_id": "system", 00:15:31.056 "dma_device_type": 1 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.056 "dma_device_type": 2 00:15:31.056 } 00:15:31.056 ], 00:15:31.056 "driver_specific": { 00:15:31.056 "raid": { 00:15:31.056 "uuid": "1da60072-1d77-487e-9d33-04d2d421c831", 00:15:31.056 "strip_size_kb": 64, 00:15:31.056 "state": "online", 00:15:31.056 "raid_level": "raid0", 00:15:31.056 "superblock": true, 00:15:31.056 "num_base_bdevs": 4, 00:15:31.056 "num_base_bdevs_discovered": 4, 00:15:31.056 "num_base_bdevs_operational": 4, 00:15:31.056 "base_bdevs_list": [ 00:15:31.056 { 00:15:31.056 "name": "pt1", 00:15:31.056 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:31.056 "is_configured": true, 00:15:31.056 "data_offset": 2048, 00:15:31.056 "data_size": 63488 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "name": "pt2", 00:15:31.056 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:31.056 "is_configured": true, 00:15:31.056 "data_offset": 2048, 00:15:31.056 "data_size": 63488 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "name": "pt3", 00:15:31.056 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:31.056 "is_configured": true, 00:15:31.056 "data_offset": 2048, 00:15:31.056 "data_size": 63488 00:15:31.056 }, 00:15:31.056 { 00:15:31.056 "name": "pt4", 00:15:31.056 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:31.056 "is_configured": true, 00:15:31.056 "data_offset": 2048, 00:15:31.056 "data_size": 63488 00:15:31.056 } 00:15:31.056 ] 00:15:31.056 } 00:15:31.056 } 00:15:31.056 }' 00:15:31.056 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:31.316 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:31.316 pt2 00:15:31.316 pt3 00:15:31.316 pt4' 00:15:31.316 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.316 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:31.316 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:31.316 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:31.316 "name": "pt1", 00:15:31.316 "aliases": [ 00:15:31.316 "00000000-0000-0000-0000-000000000001" 00:15:31.316 ], 00:15:31.316 "product_name": "passthru", 00:15:31.316 "block_size": 512, 00:15:31.316 "num_blocks": 65536, 00:15:31.316 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:31.316 "assigned_rate_limits": { 00:15:31.316 "rw_ios_per_sec": 0, 00:15:31.316 "rw_mbytes_per_sec": 0, 00:15:31.316 "r_mbytes_per_sec": 0, 00:15:31.316 "w_mbytes_per_sec": 0 00:15:31.316 }, 00:15:31.316 "claimed": true, 00:15:31.316 "claim_type": "exclusive_write", 00:15:31.316 "zoned": false, 00:15:31.316 "supported_io_types": { 00:15:31.316 "read": true, 00:15:31.316 "write": true, 00:15:31.316 "unmap": true, 00:15:31.316 "flush": true, 00:15:31.316 "reset": true, 00:15:31.316 "nvme_admin": false, 00:15:31.316 "nvme_io": false, 00:15:31.316 "nvme_io_md": false, 00:15:31.316 "write_zeroes": true, 00:15:31.316 "zcopy": true, 00:15:31.316 "get_zone_info": false, 00:15:31.316 "zone_management": false, 00:15:31.316 "zone_append": false, 00:15:31.316 "compare": false, 00:15:31.316 "compare_and_write": false, 00:15:31.316 "abort": true, 00:15:31.316 "seek_hole": false, 00:15:31.316 "seek_data": false, 00:15:31.316 "copy": true, 00:15:31.316 "nvme_iov_md": 
false 00:15:31.316 }, 00:15:31.316 "memory_domains": [ 00:15:31.316 { 00:15:31.316 "dma_device_id": "system", 00:15:31.316 "dma_device_type": 1 00:15:31.316 }, 00:15:31.316 { 00:15:31.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.316 "dma_device_type": 2 00:15:31.316 } 00:15:31.316 ], 00:15:31.316 "driver_specific": { 00:15:31.316 "passthru": { 00:15:31.316 "name": "pt1", 00:15:31.316 "base_bdev_name": "malloc1" 00:15:31.316 } 00:15:31.316 } 00:15:31.316 }' 00:15:31.316 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.316 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.573 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:31.573 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.573 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.573 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:31.574 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.574 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.574 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:31.574 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.574 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.574 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:31.574 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.574 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:31.574 11:56:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:31.832 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:31.832 "name": "pt2", 00:15:31.832 "aliases": [ 00:15:31.832 "00000000-0000-0000-0000-000000000002" 00:15:31.832 ], 00:15:31.832 "product_name": "passthru", 00:15:31.832 "block_size": 512, 00:15:31.832 "num_blocks": 65536, 00:15:31.832 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:31.832 "assigned_rate_limits": { 00:15:31.832 "rw_ios_per_sec": 0, 00:15:31.832 "rw_mbytes_per_sec": 0, 00:15:31.832 "r_mbytes_per_sec": 0, 00:15:31.832 "w_mbytes_per_sec": 0 00:15:31.832 }, 00:15:31.832 "claimed": true, 00:15:31.832 "claim_type": "exclusive_write", 00:15:31.832 "zoned": false, 00:15:31.832 "supported_io_types": { 00:15:31.832 "read": true, 00:15:31.832 "write": true, 00:15:31.832 "unmap": true, 00:15:31.832 "flush": true, 00:15:31.832 "reset": true, 00:15:31.832 "nvme_admin": false, 00:15:31.832 "nvme_io": false, 00:15:31.832 "nvme_io_md": false, 00:15:31.832 "write_zeroes": true, 00:15:31.832 "zcopy": true, 00:15:31.832 "get_zone_info": false, 00:15:31.832 "zone_management": false, 00:15:31.832 "zone_append": false, 00:15:31.832 "compare": false, 00:15:31.832 "compare_and_write": false, 00:15:31.832 "abort": true, 00:15:31.832 "seek_hole": false, 00:15:31.832 "seek_data": false, 00:15:31.832 "copy": true, 00:15:31.832 "nvme_iov_md": false 00:15:31.832 }, 00:15:31.832 "memory_domains": [ 00:15:31.832 { 00:15:31.832 "dma_device_id": "system", 00:15:31.832 "dma_device_type": 1 00:15:31.832 }, 00:15:31.832 { 00:15:31.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.832 "dma_device_type": 2 00:15:31.832 } 00:15:31.832 ], 00:15:31.832 "driver_specific": { 00:15:31.832 "passthru": { 00:15:31.832 "name": "pt2", 00:15:31.832 "base_bdev_name": "malloc2" 00:15:31.832 } 00:15:31.832 } 00:15:31.832 }' 00:15:31.832 11:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:15:31.832 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.832 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:31.832 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:32.091 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.349 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.349 "name": "pt3", 00:15:32.349 "aliases": [ 00:15:32.349 "00000000-0000-0000-0000-000000000003" 00:15:32.349 ], 00:15:32.349 "product_name": "passthru", 00:15:32.349 "block_size": 512, 00:15:32.349 "num_blocks": 65536, 00:15:32.349 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:32.349 "assigned_rate_limits": { 00:15:32.349 "rw_ios_per_sec": 0, 00:15:32.349 "rw_mbytes_per_sec": 0, 
00:15:32.349 "r_mbytes_per_sec": 0, 00:15:32.349 "w_mbytes_per_sec": 0 00:15:32.349 }, 00:15:32.349 "claimed": true, 00:15:32.349 "claim_type": "exclusive_write", 00:15:32.349 "zoned": false, 00:15:32.349 "supported_io_types": { 00:15:32.349 "read": true, 00:15:32.349 "write": true, 00:15:32.349 "unmap": true, 00:15:32.349 "flush": true, 00:15:32.349 "reset": true, 00:15:32.349 "nvme_admin": false, 00:15:32.349 "nvme_io": false, 00:15:32.349 "nvme_io_md": false, 00:15:32.349 "write_zeroes": true, 00:15:32.349 "zcopy": true, 00:15:32.349 "get_zone_info": false, 00:15:32.349 "zone_management": false, 00:15:32.349 "zone_append": false, 00:15:32.349 "compare": false, 00:15:32.349 "compare_and_write": false, 00:15:32.349 "abort": true, 00:15:32.349 "seek_hole": false, 00:15:32.350 "seek_data": false, 00:15:32.350 "copy": true, 00:15:32.350 "nvme_iov_md": false 00:15:32.350 }, 00:15:32.350 "memory_domains": [ 00:15:32.350 { 00:15:32.350 "dma_device_id": "system", 00:15:32.350 "dma_device_type": 1 00:15:32.350 }, 00:15:32.350 { 00:15:32.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.350 "dma_device_type": 2 00:15:32.350 } 00:15:32.350 ], 00:15:32.350 "driver_specific": { 00:15:32.350 "passthru": { 00:15:32.350 "name": "pt3", 00:15:32.350 "base_bdev_name": "malloc3" 00:15:32.350 } 00:15:32.350 } 00:15:32.350 }' 00:15:32.350 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.350 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.350 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.350 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.350 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.608 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.609 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:15:32.609 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.609 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.609 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.609 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.609 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.609 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.609 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:15:32.609 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.867 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.867 "name": "pt4", 00:15:32.867 "aliases": [ 00:15:32.867 "00000000-0000-0000-0000-000000000004" 00:15:32.867 ], 00:15:32.867 "product_name": "passthru", 00:15:32.867 "block_size": 512, 00:15:32.867 "num_blocks": 65536, 00:15:32.867 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:32.867 "assigned_rate_limits": { 00:15:32.867 "rw_ios_per_sec": 0, 00:15:32.867 "rw_mbytes_per_sec": 0, 00:15:32.867 "r_mbytes_per_sec": 0, 00:15:32.867 "w_mbytes_per_sec": 0 00:15:32.867 }, 00:15:32.867 "claimed": true, 00:15:32.867 "claim_type": "exclusive_write", 00:15:32.867 "zoned": false, 00:15:32.867 "supported_io_types": { 00:15:32.867 "read": true, 00:15:32.867 "write": true, 00:15:32.867 "unmap": true, 00:15:32.867 "flush": true, 00:15:32.867 "reset": true, 00:15:32.867 "nvme_admin": false, 00:15:32.867 "nvme_io": false, 00:15:32.867 "nvme_io_md": false, 00:15:32.867 "write_zeroes": true, 00:15:32.868 "zcopy": true, 00:15:32.868 "get_zone_info": false, 00:15:32.868 
"zone_management": false, 00:15:32.868 "zone_append": false, 00:15:32.868 "compare": false, 00:15:32.868 "compare_and_write": false, 00:15:32.868 "abort": true, 00:15:32.868 "seek_hole": false, 00:15:32.868 "seek_data": false, 00:15:32.868 "copy": true, 00:15:32.868 "nvme_iov_md": false 00:15:32.868 }, 00:15:32.868 "memory_domains": [ 00:15:32.868 { 00:15:32.868 "dma_device_id": "system", 00:15:32.868 "dma_device_type": 1 00:15:32.868 }, 00:15:32.868 { 00:15:32.868 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.868 "dma_device_type": 2 00:15:32.868 } 00:15:32.868 ], 00:15:32.868 "driver_specific": { 00:15:32.868 "passthru": { 00:15:32.868 "name": "pt4", 00:15:32.868 "base_bdev_name": "malloc4" 00:15:32.868 } 00:15:32.868 } 00:15:32.868 }' 00:15:32.868 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.868 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.868 11:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.868 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.868 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.868 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.868 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.126 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.126 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.126 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.126 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.126 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.126 11:56:23 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:33.126 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:33.385 [2024-07-12 11:56:23.403396] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1da60072-1d77-487e-9d33-04d2d421c831 '!=' 1da60072-1d77-487e-9d33-04d2d421c831 ']' 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 658651 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 658651 ']' 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 658651 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 658651 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 658651' 00:15:33.385 killing process with pid 658651 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 658651 
00:15:33.385 [2024-07-12 11:56:23.471513] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:33.385 [2024-07-12 11:56:23.471574] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:33.385 [2024-07-12 11:56:23.471622] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:33.385 [2024-07-12 11:56:23.471629] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22f3ff0 name raid_bdev1, state offline 00:15:33.385 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 658651 00:15:33.385 [2024-07-12 11:56:23.502751] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:33.645 11:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:33.645 00:15:33.645 real 0m12.317s 00:15:33.645 user 0m22.415s 00:15:33.645 sys 0m1.957s 00:15:33.645 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:33.645 11:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.645 ************************************ 00:15:33.645 END TEST raid_superblock_test 00:15:33.645 ************************************ 00:15:33.645 11:56:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:33.645 11:56:23 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:15:33.645 11:56:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:33.645 11:56:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:33.645 11:56:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:33.645 ************************************ 00:15:33.645 START TEST raid_read_error_test 00:15:33.645 ************************************ 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:15:33.645 
11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:33.645 11:56:23 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.hBZAQmqJr9 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=661033 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 661033 /var/tmp/spdk-raid.sock 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 661033 ']' 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on 
UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:33.645 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:33.645 11:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.645 [2024-07-12 11:56:23.793338] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:15:33.645 [2024-07-12 11:56:23.793375] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid661033 ] 00:15:33.645 [2024-07-12 11:56:23.855900] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:33.904 [2024-07-12 11:56:23.933464] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:33.904 [2024-07-12 11:56:23.983079] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:33.904 [2024-07-12 11:56:23.983102] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:34.472 11:56:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:34.472 11:56:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:34.472 11:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:34.472 11:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:34.731 BaseBdev1_malloc 00:15:34.731 11:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:34.731 true 00:15:34.731 
11:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:34.990 [2024-07-12 11:56:25.070335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:34.990 [2024-07-12 11:56:25.070363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:34.990 [2024-07-12 11:56:25.070374] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28772d0 00:15:34.990 [2024-07-12 11:56:25.070380] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:34.990 [2024-07-12 11:56:25.071622] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:34.990 [2024-07-12 11:56:25.071644] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:34.990 BaseBdev1 00:15:34.991 11:56:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:34.991 11:56:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:35.250 BaseBdev2_malloc 00:15:35.250 11:56:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:35.250 true 00:15:35.250 11:56:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:35.509 [2024-07-12 11:56:25.567146] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:35.509 [2024-07-12 11:56:25.567176] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:35.509 [2024-07-12 11:56:25.567188] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x287bf40 00:15:35.509 [2024-07-12 11:56:25.567193] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:35.509 [2024-07-12 11:56:25.568238] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:35.509 [2024-07-12 11:56:25.568258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:35.509 BaseBdev2 00:15:35.509 11:56:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:35.509 11:56:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:35.509 BaseBdev3_malloc 00:15:35.509 11:56:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:35.767 true 00:15:35.767 11:56:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:36.024 [2024-07-12 11:56:26.063818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:36.024 [2024-07-12 11:56:26.063847] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.025 [2024-07-12 11:56:26.063858] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x287eea0 00:15:36.025 [2024-07-12 11:56:26.063864] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.025 [2024-07-12 11:56:26.064947] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:15:36.025 [2024-07-12 11:56:26.064966] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:36.025 BaseBdev3 00:15:36.025 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:36.025 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:15:36.025 BaseBdev4_malloc 00:15:36.025 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:15:36.283 true 00:15:36.283 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:15:36.541 [2024-07-12 11:56:26.572576] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:15:36.541 [2024-07-12 11:56:26.572606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.541 [2024-07-12 11:56:26.572617] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28782f0 00:15:36.541 [2024-07-12 11:56:26.572623] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.541 [2024-07-12 11:56:26.573687] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:36.541 [2024-07-12 11:56:26.573706] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:15:36.541 BaseBdev4 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:15:36.541 [2024-07-12 11:56:26.737028] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:36.541 [2024-07-12 11:56:26.737934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:36.541 [2024-07-12 11:56:26.737981] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:36.541 [2024-07-12 11:56:26.738019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:36.541 [2024-07-12 11:56:26.738163] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2878d30 00:15:36.541 [2024-07-12 11:56:26.738169] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:36.541 [2024-07-12 11:56:26.738297] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2761720 00:15:36.541 [2024-07-12 11:56:26.738397] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2878d30 00:15:36.541 [2024-07-12 11:56:26.738402] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2878d30 00:15:36.541 [2024-07-12 11:56:26.738470] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.541 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:36.799 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.799 "name": "raid_bdev1", 00:15:36.799 "uuid": "4eef1bd0-080a-43e3-9e32-874ab717d706", 00:15:36.799 "strip_size_kb": 64, 00:15:36.799 "state": "online", 00:15:36.799 "raid_level": "raid0", 00:15:36.799 "superblock": true, 00:15:36.799 "num_base_bdevs": 4, 00:15:36.799 "num_base_bdevs_discovered": 4, 00:15:36.799 "num_base_bdevs_operational": 4, 00:15:36.799 "base_bdevs_list": [ 00:15:36.799 { 00:15:36.799 "name": "BaseBdev1", 00:15:36.799 "uuid": "e8b1ef64-d45f-5841-b4e4-ca0d4ff5156c", 00:15:36.799 "is_configured": true, 00:15:36.799 "data_offset": 2048, 00:15:36.799 "data_size": 63488 00:15:36.799 }, 00:15:36.799 { 00:15:36.799 "name": "BaseBdev2", 00:15:36.799 "uuid": "2fea2018-40d1-5fe7-8d50-b3843d90589f", 00:15:36.799 "is_configured": true, 00:15:36.799 "data_offset": 2048, 00:15:36.799 "data_size": 63488 00:15:36.799 }, 00:15:36.799 { 00:15:36.799 "name": "BaseBdev3", 00:15:36.799 "uuid": "e99ca1a6-9961-55e7-a43b-f8de7fb4b223", 00:15:36.799 "is_configured": true, 00:15:36.799 "data_offset": 2048, 00:15:36.799 "data_size": 63488 00:15:36.799 }, 00:15:36.799 { 00:15:36.799 "name": "BaseBdev4", 00:15:36.799 "uuid": "cd9b7da2-8e21-5649-9741-01916b5f12b2", 00:15:36.799 
"is_configured": true, 00:15:36.799 "data_offset": 2048, 00:15:36.799 "data_size": 63488 00:15:36.799 } 00:15:36.799 ] 00:15:36.799 }' 00:15:36.799 11:56:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.799 11:56:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.366 11:56:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:37.366 11:56:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:37.366 [2024-07-12 11:56:27.475108] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2764f90 00:15:38.303 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.562 "name": "raid_bdev1", 00:15:38.562 "uuid": "4eef1bd0-080a-43e3-9e32-874ab717d706", 00:15:38.562 "strip_size_kb": 64, 00:15:38.562 "state": "online", 00:15:38.562 "raid_level": "raid0", 00:15:38.562 "superblock": true, 00:15:38.562 "num_base_bdevs": 4, 00:15:38.562 "num_base_bdevs_discovered": 4, 00:15:38.562 "num_base_bdevs_operational": 4, 00:15:38.562 "base_bdevs_list": [ 00:15:38.562 { 00:15:38.562 "name": "BaseBdev1", 00:15:38.562 "uuid": "e8b1ef64-d45f-5841-b4e4-ca0d4ff5156c", 00:15:38.562 "is_configured": true, 00:15:38.562 "data_offset": 2048, 00:15:38.562 "data_size": 63488 00:15:38.562 }, 00:15:38.562 { 00:15:38.562 "name": "BaseBdev2", 00:15:38.562 "uuid": "2fea2018-40d1-5fe7-8d50-b3843d90589f", 00:15:38.562 "is_configured": true, 00:15:38.562 "data_offset": 2048, 00:15:38.562 "data_size": 63488 00:15:38.562 }, 00:15:38.562 { 00:15:38.562 "name": "BaseBdev3", 00:15:38.562 "uuid": "e99ca1a6-9961-55e7-a43b-f8de7fb4b223", 00:15:38.562 "is_configured": true, 00:15:38.562 "data_offset": 2048, 00:15:38.562 "data_size": 63488 00:15:38.562 }, 00:15:38.562 { 00:15:38.562 "name": "BaseBdev4", 00:15:38.562 "uuid": 
"cd9b7da2-8e21-5649-9741-01916b5f12b2", 00:15:38.562 "is_configured": true, 00:15:38.562 "data_offset": 2048, 00:15:38.562 "data_size": 63488 00:15:38.562 } 00:15:38.562 ] 00:15:38.562 }' 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.562 11:56:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.129 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:39.129 [2024-07-12 11:56:29.343025] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:39.129 [2024-07-12 11:56:29.343051] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:39.129 [2024-07-12 11:56:29.345116] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:39.129 [2024-07-12 11:56:29.345141] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:39.129 [2024-07-12 11:56:29.345167] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:39.129 [2024-07-12 11:56:29.345172] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2878d30 name raid_bdev1, state offline 00:15:39.129 0 00:15:39.129 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 661033 00:15:39.129 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 661033 ']' 00:15:39.129 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 661033 00:15:39.129 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:39.129 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:39.129 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps 
--no-headers -o comm= 661033 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 661033' 00:15:39.388 killing process with pid 661033 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 661033 00:15:39.388 [2024-07-12 11:56:29.402573] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 661033 00:15:39.388 [2024-07-12 11:56:29.428287] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.hBZAQmqJr9 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.54 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.54 != \0\.\0\0 ]] 00:15:39.388 00:15:39.388 real 0m5.883s 00:15:39.388 user 0m9.245s 00:15:39.388 sys 0m0.857s 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:39.388 11:56:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.388 
************************************ 00:15:39.388 END TEST raid_read_error_test 00:15:39.388 ************************************ 00:15:39.646 11:56:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:39.646 11:56:29 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:15:39.646 11:56:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:39.646 11:56:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:39.646 11:56:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:39.646 ************************************ 00:15:39.646 START TEST raid_write_error_test 00:15:39.646 ************************************ 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.5SMZRNOtK7 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # 
raid_pid=662048 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 662048 /var/tmp/spdk-raid.sock 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 662048 ']' 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:39.646 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:39.646 11:56:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.646 [2024-07-12 11:56:29.737548] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:15:39.646 [2024-07-12 11:56:29.737587] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid662048 ] 00:15:39.646 [2024-07-12 11:56:29.798802] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:39.646 [2024-07-12 11:56:29.876980] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:39.904 [2024-07-12 11:56:29.928025] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:39.905 [2024-07-12 11:56:29.928040] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:40.470 11:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:40.470 11:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:40.470 11:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:40.470 11:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:40.470 BaseBdev1_malloc 00:15:40.470 11:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:40.729 true 00:15:40.729 11:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:40.988 [2024-07-12 11:56:31.016036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:40.988 [2024-07-12 11:56:31.016067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.988 
[2024-07-12 11:56:31.016078] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdfa2d0 00:15:40.988 [2024-07-12 11:56:31.016084] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.988 [2024-07-12 11:56:31.017291] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.988 [2024-07-12 11:56:31.017312] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:40.988 BaseBdev1 00:15:40.988 11:56:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:40.988 11:56:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:40.988 BaseBdev2_malloc 00:15:40.988 11:56:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:41.247 true 00:15:41.247 11:56:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:41.505 [2024-07-12 11:56:31.516850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:41.505 [2024-07-12 11:56:31.516881] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:41.505 [2024-07-12 11:56:31.516892] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdfef40 00:15:41.505 [2024-07-12 11:56:31.516898] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:41.505 [2024-07-12 11:56:31.517990] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:41.505 [2024-07-12 11:56:31.518010] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:41.505 BaseBdev2 00:15:41.505 11:56:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:41.505 11:56:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:41.505 BaseBdev3_malloc 00:15:41.505 11:56:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:41.763 true 00:15:41.763 11:56:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:41.763 [2024-07-12 11:56:31.997621] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:41.763 [2024-07-12 11:56:31.997649] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:41.763 [2024-07-12 11:56:31.997659] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe01ea0 00:15:41.763 [2024-07-12 11:56:31.997665] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:41.763 [2024-07-12 11:56:31.998679] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:41.763 [2024-07-12 11:56:31.998698] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:41.763 BaseBdev3 00:15:42.022 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:42.022 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:15:42.022 BaseBdev4_malloc 00:15:42.022 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:15:42.280 true 00:15:42.280 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:15:42.280 [2024-07-12 11:56:32.462207] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:15:42.280 [2024-07-12 11:56:32.462236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:42.280 [2024-07-12 11:56:32.462245] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdfb2f0 00:15:42.280 [2024-07-12 11:56:32.462251] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:42.280 [2024-07-12 11:56:32.463287] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:42.280 [2024-07-12 11:56:32.463307] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:15:42.280 BaseBdev4 00:15:42.280 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:15:42.539 [2024-07-12 11:56:32.630682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:42.539 [2024-07-12 11:56:32.631618] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:42.539 [2024-07-12 11:56:32.631665] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:42.539 [2024-07-12 11:56:32.631705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:15:42.539 [2024-07-12 11:56:32.631854] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdfbd30 00:15:42.539 [2024-07-12 11:56:32.631861] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:42.539 [2024-07-12 11:56:32.631995] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce4720 00:15:42.539 [2024-07-12 11:56:32.632098] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdfbd30 00:15:42.539 [2024-07-12 11:56:32.632104] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdfbd30 00:15:42.539 [2024-07-12 11:56:32.632173] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.539 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:42.798 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:42.798 "name": "raid_bdev1", 00:15:42.798 "uuid": "8a70f760-6a8e-4bab-b5d2-b4d85f0e8f66", 00:15:42.798 "strip_size_kb": 64, 00:15:42.798 "state": "online", 00:15:42.798 "raid_level": "raid0", 00:15:42.798 "superblock": true, 00:15:42.798 "num_base_bdevs": 4, 00:15:42.798 "num_base_bdevs_discovered": 4, 00:15:42.798 "num_base_bdevs_operational": 4, 00:15:42.798 "base_bdevs_list": [ 00:15:42.798 { 00:15:42.798 "name": "BaseBdev1", 00:15:42.798 "uuid": "536887cf-e54f-5cb7-a063-c2bff1b7879e", 00:15:42.798 "is_configured": true, 00:15:42.798 "data_offset": 2048, 00:15:42.798 "data_size": 63488 00:15:42.798 }, 00:15:42.798 { 00:15:42.798 "name": "BaseBdev2", 00:15:42.798 "uuid": "6a70b839-3cdc-5d6d-9fdc-533d01de5fff", 00:15:42.798 "is_configured": true, 00:15:42.798 "data_offset": 2048, 00:15:42.798 "data_size": 63488 00:15:42.798 }, 00:15:42.798 { 00:15:42.798 "name": "BaseBdev3", 00:15:42.798 "uuid": "cb801599-07ef-5840-b5ae-901a652944bf", 00:15:42.798 "is_configured": true, 00:15:42.798 "data_offset": 2048, 00:15:42.798 "data_size": 63488 00:15:42.798 }, 00:15:42.798 { 00:15:42.798 "name": "BaseBdev4", 00:15:42.798 "uuid": "8c0ce062-cf35-587a-8d22-271d1f0629fd", 00:15:42.798 "is_configured": true, 00:15:42.798 "data_offset": 2048, 00:15:42.798 "data_size": 63488 00:15:42.798 } 00:15:42.798 ] 00:15:42.798 }' 00:15:42.798 11:56:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:42.798 11:56:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.055 11:56:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:43.312 11:56:33 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:43.312 [2024-07-12 11:56:33.376792] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce7f90 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.250 
11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.250 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:44.509 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.509 "name": "raid_bdev1", 00:15:44.509 "uuid": "8a70f760-6a8e-4bab-b5d2-b4d85f0e8f66", 00:15:44.509 "strip_size_kb": 64, 00:15:44.509 "state": "online", 00:15:44.509 "raid_level": "raid0", 00:15:44.509 "superblock": true, 00:15:44.509 "num_base_bdevs": 4, 00:15:44.509 "num_base_bdevs_discovered": 4, 00:15:44.509 "num_base_bdevs_operational": 4, 00:15:44.509 "base_bdevs_list": [ 00:15:44.509 { 00:15:44.509 "name": "BaseBdev1", 00:15:44.509 "uuid": "536887cf-e54f-5cb7-a063-c2bff1b7879e", 00:15:44.509 "is_configured": true, 00:15:44.509 "data_offset": 2048, 00:15:44.509 "data_size": 63488 00:15:44.509 }, 00:15:44.509 { 00:15:44.509 "name": "BaseBdev2", 00:15:44.509 "uuid": "6a70b839-3cdc-5d6d-9fdc-533d01de5fff", 00:15:44.509 "is_configured": true, 00:15:44.509 "data_offset": 2048, 00:15:44.509 "data_size": 63488 00:15:44.509 }, 00:15:44.509 { 00:15:44.509 "name": "BaseBdev3", 00:15:44.509 "uuid": "cb801599-07ef-5840-b5ae-901a652944bf", 00:15:44.509 "is_configured": true, 00:15:44.509 "data_offset": 2048, 00:15:44.509 "data_size": 63488 00:15:44.509 }, 00:15:44.509 { 00:15:44.509 "name": "BaseBdev4", 00:15:44.509 "uuid": "8c0ce062-cf35-587a-8d22-271d1f0629fd", 00:15:44.509 "is_configured": true, 00:15:44.509 "data_offset": 2048, 00:15:44.509 "data_size": 63488 00:15:44.509 } 00:15:44.509 ] 00:15:44.509 }' 00:15:44.509 11:56:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.510 11:56:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.077 11:56:35 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:45.077 [2024-07-12 11:56:35.313445] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:45.077 [2024-07-12 11:56:35.313478] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:45.077 [2024-07-12 11:56:35.315542] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:45.077 [2024-07-12 11:56:35.315567] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:45.077 [2024-07-12 11:56:35.315593] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:45.077 [2024-07-12 11:56:35.315599] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdfbd30 name raid_bdev1, state offline 00:15:45.077 0 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 662048 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 662048 ']' 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 662048 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 662048 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 662048' 00:15:45.336 killing process with pid 
662048 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 662048 00:15:45.336 [2024-07-12 11:56:35.365146] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 662048 00:15:45.336 [2024-07-12 11:56:35.391420] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.5SMZRNOtK7 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:15:45.336 00:15:45.336 real 0m5.904s 00:15:45.336 user 0m9.288s 00:15:45.336 sys 0m0.848s 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:45.336 11:56:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.336 ************************************ 00:15:45.336 END TEST raid_write_error_test 00:15:45.336 ************************************ 00:15:45.595 11:56:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:45.595 11:56:35 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:45.595 11:56:35 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 
00:15:45.595 11:56:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:45.595 11:56:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:45.595 11:56:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:45.595 ************************************ 00:15:45.595 START TEST raid_state_function_test 00:15:45.595 ************************************ 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ 
)) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=663154 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 663154' 00:15:45.595 Process raid pid: 663154 00:15:45.595 11:56:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:45.595 11:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 663154 /var/tmp/spdk-raid.sock 00:15:45.596 11:56:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 663154 ']' 00:15:45.596 11:56:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:45.596 11:56:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:45.596 11:56:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:45.596 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:45.596 11:56:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:45.596 11:56:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.596 [2024-07-12 11:56:35.695745] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:15:45.596 [2024-07-12 11:56:35.695783] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:45.596 [2024-07-12 11:56:35.760092] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:45.596 [2024-07-12 11:56:35.836686] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:45.854 [2024-07-12 11:56:35.892392] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:45.854 [2024-07-12 11:56:35.892414] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:46.423 [2024-07-12 11:56:36.627341] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:46.423 [2024-07-12 11:56:36.627371] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:46.423 [2024-07-12 11:56:36.627377] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:46.423 [2024-07-12 11:56:36.627383] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:46.423 [2024-07-12 11:56:36.627387] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:46.423 [2024-07-12 11:56:36.627393] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:46.423 [2024-07-12 
11:56:36.627397] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:46.423 [2024-07-12 11:56:36.627402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.423 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:46.682 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:46.682 "name": "Existed_Raid", 00:15:46.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.682 "strip_size_kb": 64, 00:15:46.682 "state": 
"configuring", 00:15:46.682 "raid_level": "concat", 00:15:46.682 "superblock": false, 00:15:46.682 "num_base_bdevs": 4, 00:15:46.682 "num_base_bdevs_discovered": 0, 00:15:46.682 "num_base_bdevs_operational": 4, 00:15:46.682 "base_bdevs_list": [ 00:15:46.682 { 00:15:46.682 "name": "BaseBdev1", 00:15:46.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.682 "is_configured": false, 00:15:46.682 "data_offset": 0, 00:15:46.682 "data_size": 0 00:15:46.682 }, 00:15:46.682 { 00:15:46.682 "name": "BaseBdev2", 00:15:46.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.682 "is_configured": false, 00:15:46.682 "data_offset": 0, 00:15:46.682 "data_size": 0 00:15:46.682 }, 00:15:46.682 { 00:15:46.682 "name": "BaseBdev3", 00:15:46.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.682 "is_configured": false, 00:15:46.682 "data_offset": 0, 00:15:46.682 "data_size": 0 00:15:46.682 }, 00:15:46.682 { 00:15:46.682 "name": "BaseBdev4", 00:15:46.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.682 "is_configured": false, 00:15:46.682 "data_offset": 0, 00:15:46.682 "data_size": 0 00:15:46.682 } 00:15:46.682 ] 00:15:46.682 }' 00:15:46.682 11:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:46.682 11:56:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:47.251 11:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:47.251 [2024-07-12 11:56:37.421302] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:47.251 [2024-07-12 11:56:37.421321] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21781f0 name Existed_Raid, state configuring 00:15:47.251 11:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:47.510 [2024-07-12 11:56:37.589747] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:47.510 [2024-07-12 11:56:37.589767] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:47.510 [2024-07-12 11:56:37.589772] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:47.510 [2024-07-12 11:56:37.589776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:47.510 [2024-07-12 11:56:37.589780] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:47.510 [2024-07-12 11:56:37.589785] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:47.510 [2024-07-12 11:56:37.589804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:47.510 [2024-07-12 11:56:37.589809] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:47.510 11:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:47.510 [2024-07-12 11:56:37.750319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:47.510 BaseBdev1 00:15:47.769 11:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:47.769 11:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:47.769 11:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:47.769 11:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:47.769 11:56:37 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:47.769 11:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:47.769 11:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.769 11:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:48.069 [ 00:15:48.069 { 00:15:48.069 "name": "BaseBdev1", 00:15:48.069 "aliases": [ 00:15:48.069 "189179a0-a13c-4d9c-9c71-941574d641f5" 00:15:48.069 ], 00:15:48.069 "product_name": "Malloc disk", 00:15:48.069 "block_size": 512, 00:15:48.069 "num_blocks": 65536, 00:15:48.069 "uuid": "189179a0-a13c-4d9c-9c71-941574d641f5", 00:15:48.069 "assigned_rate_limits": { 00:15:48.069 "rw_ios_per_sec": 0, 00:15:48.069 "rw_mbytes_per_sec": 0, 00:15:48.069 "r_mbytes_per_sec": 0, 00:15:48.069 "w_mbytes_per_sec": 0 00:15:48.069 }, 00:15:48.069 "claimed": true, 00:15:48.069 "claim_type": "exclusive_write", 00:15:48.069 "zoned": false, 00:15:48.069 "supported_io_types": { 00:15:48.069 "read": true, 00:15:48.069 "write": true, 00:15:48.069 "unmap": true, 00:15:48.069 "flush": true, 00:15:48.069 "reset": true, 00:15:48.069 "nvme_admin": false, 00:15:48.069 "nvme_io": false, 00:15:48.069 "nvme_io_md": false, 00:15:48.069 "write_zeroes": true, 00:15:48.069 "zcopy": true, 00:15:48.069 "get_zone_info": false, 00:15:48.069 "zone_management": false, 00:15:48.069 "zone_append": false, 00:15:48.069 "compare": false, 00:15:48.069 "compare_and_write": false, 00:15:48.069 "abort": true, 00:15:48.069 "seek_hole": false, 00:15:48.069 "seek_data": false, 00:15:48.069 "copy": true, 00:15:48.069 "nvme_iov_md": false 00:15:48.069 }, 00:15:48.069 "memory_domains": [ 00:15:48.069 { 
00:15:48.069 "dma_device_id": "system", 00:15:48.069 "dma_device_type": 1 00:15:48.069 }, 00:15:48.069 { 00:15:48.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.069 "dma_device_type": 2 00:15:48.069 } 00:15:48.069 ], 00:15:48.069 "driver_specific": {} 00:15:48.069 } 00:15:48.069 ] 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:15:48.069 "name": "Existed_Raid", 00:15:48.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.069 "strip_size_kb": 64, 00:15:48.069 "state": "configuring", 00:15:48.069 "raid_level": "concat", 00:15:48.069 "superblock": false, 00:15:48.069 "num_base_bdevs": 4, 00:15:48.069 "num_base_bdevs_discovered": 1, 00:15:48.069 "num_base_bdevs_operational": 4, 00:15:48.069 "base_bdevs_list": [ 00:15:48.069 { 00:15:48.069 "name": "BaseBdev1", 00:15:48.069 "uuid": "189179a0-a13c-4d9c-9c71-941574d641f5", 00:15:48.069 "is_configured": true, 00:15:48.069 "data_offset": 0, 00:15:48.069 "data_size": 65536 00:15:48.069 }, 00:15:48.069 { 00:15:48.069 "name": "BaseBdev2", 00:15:48.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.069 "is_configured": false, 00:15:48.069 "data_offset": 0, 00:15:48.069 "data_size": 0 00:15:48.069 }, 00:15:48.069 { 00:15:48.069 "name": "BaseBdev3", 00:15:48.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.069 "is_configured": false, 00:15:48.069 "data_offset": 0, 00:15:48.069 "data_size": 0 00:15:48.069 }, 00:15:48.069 { 00:15:48.069 "name": "BaseBdev4", 00:15:48.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.069 "is_configured": false, 00:15:48.069 "data_offset": 0, 00:15:48.069 "data_size": 0 00:15:48.069 } 00:15:48.069 ] 00:15:48.069 }' 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.069 11:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.655 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:48.655 [2024-07-12 11:56:38.857177] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:48.655 [2024-07-12 11:56:38.857208] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2177a60 name Existed_Raid, state configuring 
00:15:48.655 11:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:48.915 [2024-07-12 11:56:39.025635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:48.915 [2024-07-12 11:56:39.026708] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:48.915 [2024-07-12 11:56:39.026731] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:48.915 [2024-07-12 11:56:39.026737] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:48.915 [2024-07-12 11:56:39.026742] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:48.915 [2024-07-12 11:56:39.026747] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:48.915 [2024-07-12 11:56:39.026752] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.915 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:49.174 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.174 "name": "Existed_Raid", 00:15:49.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.174 "strip_size_kb": 64, 00:15:49.174 "state": "configuring", 00:15:49.174 "raid_level": "concat", 00:15:49.174 "superblock": false, 00:15:49.174 "num_base_bdevs": 4, 00:15:49.174 "num_base_bdevs_discovered": 1, 00:15:49.174 "num_base_bdevs_operational": 4, 00:15:49.174 "base_bdevs_list": [ 00:15:49.174 { 00:15:49.174 "name": "BaseBdev1", 00:15:49.174 "uuid": "189179a0-a13c-4d9c-9c71-941574d641f5", 00:15:49.174 "is_configured": true, 00:15:49.174 "data_offset": 0, 00:15:49.174 "data_size": 65536 00:15:49.174 }, 00:15:49.174 { 00:15:49.174 "name": "BaseBdev2", 00:15:49.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.174 "is_configured": false, 00:15:49.174 "data_offset": 0, 00:15:49.174 "data_size": 0 00:15:49.174 }, 00:15:49.174 { 00:15:49.174 "name": "BaseBdev3", 00:15:49.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.174 "is_configured": false, 00:15:49.174 
"data_offset": 0, 00:15:49.174 "data_size": 0 00:15:49.174 }, 00:15:49.174 { 00:15:49.174 "name": "BaseBdev4", 00:15:49.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.175 "is_configured": false, 00:15:49.175 "data_offset": 0, 00:15:49.175 "data_size": 0 00:15:49.175 } 00:15:49.175 ] 00:15:49.175 }' 00:15:49.175 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.175 11:56:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.742 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:49.742 [2024-07-12 11:56:39.866379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:49.742 BaseBdev2 00:15:49.742 11:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:49.742 11:56:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:49.742 11:56:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:49.742 11:56:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:49.742 11:56:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:49.742 11:56:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:49.742 11:56:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:50.002 [ 
00:15:50.002 { 00:15:50.002 "name": "BaseBdev2", 00:15:50.002 "aliases": [ 00:15:50.002 "a6be6fac-cbe4-4d54-b87f-a6e71eff2e67" 00:15:50.002 ], 00:15:50.002 "product_name": "Malloc disk", 00:15:50.002 "block_size": 512, 00:15:50.002 "num_blocks": 65536, 00:15:50.002 "uuid": "a6be6fac-cbe4-4d54-b87f-a6e71eff2e67", 00:15:50.002 "assigned_rate_limits": { 00:15:50.002 "rw_ios_per_sec": 0, 00:15:50.002 "rw_mbytes_per_sec": 0, 00:15:50.002 "r_mbytes_per_sec": 0, 00:15:50.002 "w_mbytes_per_sec": 0 00:15:50.002 }, 00:15:50.002 "claimed": true, 00:15:50.002 "claim_type": "exclusive_write", 00:15:50.002 "zoned": false, 00:15:50.002 "supported_io_types": { 00:15:50.002 "read": true, 00:15:50.002 "write": true, 00:15:50.002 "unmap": true, 00:15:50.002 "flush": true, 00:15:50.002 "reset": true, 00:15:50.002 "nvme_admin": false, 00:15:50.002 "nvme_io": false, 00:15:50.002 "nvme_io_md": false, 00:15:50.002 "write_zeroes": true, 00:15:50.002 "zcopy": true, 00:15:50.002 "get_zone_info": false, 00:15:50.002 "zone_management": false, 00:15:50.002 "zone_append": false, 00:15:50.002 "compare": false, 00:15:50.002 "compare_and_write": false, 00:15:50.002 "abort": true, 00:15:50.002 "seek_hole": false, 00:15:50.002 "seek_data": false, 00:15:50.002 "copy": true, 00:15:50.002 "nvme_iov_md": false 00:15:50.002 }, 00:15:50.002 "memory_domains": [ 00:15:50.002 { 00:15:50.002 "dma_device_id": "system", 00:15:50.002 "dma_device_type": 1 00:15:50.002 }, 00:15:50.002 { 00:15:50.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.002 "dma_device_type": 2 00:15:50.002 } 00:15:50.002 ], 00:15:50.002 "driver_specific": {} 00:15:50.002 } 00:15:50.002 ] 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:50.002 11:56:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.002 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.260 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.260 "name": "Existed_Raid", 00:15:50.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.260 "strip_size_kb": 64, 00:15:50.260 "state": "configuring", 00:15:50.260 "raid_level": "concat", 00:15:50.260 "superblock": false, 00:15:50.260 "num_base_bdevs": 4, 00:15:50.260 "num_base_bdevs_discovered": 2, 00:15:50.260 "num_base_bdevs_operational": 4, 00:15:50.260 "base_bdevs_list": [ 00:15:50.260 { 
00:15:50.260 "name": "BaseBdev1", 00:15:50.260 "uuid": "189179a0-a13c-4d9c-9c71-941574d641f5", 00:15:50.260 "is_configured": true, 00:15:50.260 "data_offset": 0, 00:15:50.260 "data_size": 65536 00:15:50.260 }, 00:15:50.260 { 00:15:50.260 "name": "BaseBdev2", 00:15:50.260 "uuid": "a6be6fac-cbe4-4d54-b87f-a6e71eff2e67", 00:15:50.260 "is_configured": true, 00:15:50.260 "data_offset": 0, 00:15:50.260 "data_size": 65536 00:15:50.260 }, 00:15:50.260 { 00:15:50.260 "name": "BaseBdev3", 00:15:50.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.260 "is_configured": false, 00:15:50.260 "data_offset": 0, 00:15:50.260 "data_size": 0 00:15:50.260 }, 00:15:50.260 { 00:15:50.260 "name": "BaseBdev4", 00:15:50.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.260 "is_configured": false, 00:15:50.260 "data_offset": 0, 00:15:50.260 "data_size": 0 00:15:50.260 } 00:15:50.260 ] 00:15:50.260 }' 00:15:50.260 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.260 11:56:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.828 11:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:50.828 [2024-07-12 11:56:41.040081] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:50.828 BaseBdev3 00:15:50.828 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:50.828 11:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:50.828 11:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:50.828 11:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:50.828 11:56:41 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:50.828 11:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:50.828 11:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:51.086 11:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:51.344 [ 00:15:51.344 { 00:15:51.344 "name": "BaseBdev3", 00:15:51.344 "aliases": [ 00:15:51.344 "7f53b98e-6b6b-4ac6-adc0-bdbf18736dd6" 00:15:51.344 ], 00:15:51.344 "product_name": "Malloc disk", 00:15:51.344 "block_size": 512, 00:15:51.344 "num_blocks": 65536, 00:15:51.344 "uuid": "7f53b98e-6b6b-4ac6-adc0-bdbf18736dd6", 00:15:51.344 "assigned_rate_limits": { 00:15:51.344 "rw_ios_per_sec": 0, 00:15:51.344 "rw_mbytes_per_sec": 0, 00:15:51.344 "r_mbytes_per_sec": 0, 00:15:51.344 "w_mbytes_per_sec": 0 00:15:51.344 }, 00:15:51.344 "claimed": true, 00:15:51.344 "claim_type": "exclusive_write", 00:15:51.344 "zoned": false, 00:15:51.344 "supported_io_types": { 00:15:51.344 "read": true, 00:15:51.344 "write": true, 00:15:51.344 "unmap": true, 00:15:51.344 "flush": true, 00:15:51.344 "reset": true, 00:15:51.344 "nvme_admin": false, 00:15:51.344 "nvme_io": false, 00:15:51.344 "nvme_io_md": false, 00:15:51.344 "write_zeroes": true, 00:15:51.344 "zcopy": true, 00:15:51.344 "get_zone_info": false, 00:15:51.344 "zone_management": false, 00:15:51.344 "zone_append": false, 00:15:51.344 "compare": false, 00:15:51.344 "compare_and_write": false, 00:15:51.344 "abort": true, 00:15:51.344 "seek_hole": false, 00:15:51.344 "seek_data": false, 00:15:51.344 "copy": true, 00:15:51.344 "nvme_iov_md": false 00:15:51.344 }, 00:15:51.344 "memory_domains": [ 00:15:51.344 { 00:15:51.344 "dma_device_id": "system", 
00:15:51.344 "dma_device_type": 1 00:15:51.344 }, 00:15:51.344 { 00:15:51.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.344 "dma_device_type": 2 00:15:51.344 } 00:15:51.344 ], 00:15:51.344 "driver_specific": {} 00:15:51.344 } 00:15:51.344 ] 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:51.344 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.345 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:51.345 "name": "Existed_Raid", 00:15:51.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:51.345 "strip_size_kb": 64, 00:15:51.345 "state": "configuring", 00:15:51.345 "raid_level": "concat", 00:15:51.345 "superblock": false, 00:15:51.345 "num_base_bdevs": 4, 00:15:51.345 "num_base_bdevs_discovered": 3, 00:15:51.345 "num_base_bdevs_operational": 4, 00:15:51.345 "base_bdevs_list": [ 00:15:51.345 { 00:15:51.345 "name": "BaseBdev1", 00:15:51.345 "uuid": "189179a0-a13c-4d9c-9c71-941574d641f5", 00:15:51.345 "is_configured": true, 00:15:51.345 "data_offset": 0, 00:15:51.345 "data_size": 65536 00:15:51.345 }, 00:15:51.345 { 00:15:51.345 "name": "BaseBdev2", 00:15:51.345 "uuid": "a6be6fac-cbe4-4d54-b87f-a6e71eff2e67", 00:15:51.345 "is_configured": true, 00:15:51.345 "data_offset": 0, 00:15:51.345 "data_size": 65536 00:15:51.345 }, 00:15:51.345 { 00:15:51.345 "name": "BaseBdev3", 00:15:51.345 "uuid": "7f53b98e-6b6b-4ac6-adc0-bdbf18736dd6", 00:15:51.345 "is_configured": true, 00:15:51.345 "data_offset": 0, 00:15:51.345 "data_size": 65536 00:15:51.345 }, 00:15:51.345 { 00:15:51.345 "name": "BaseBdev4", 00:15:51.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:51.345 "is_configured": false, 00:15:51.345 "data_offset": 0, 00:15:51.345 "data_size": 0 00:15:51.345 } 00:15:51.345 ] 00:15:51.345 }' 00:15:51.345 11:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:51.345 11:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.911 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:52.169 [2024-07-12 
11:56:42.209685] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:52.169 [2024-07-12 11:56:42.209714] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2178b90 00:15:52.169 [2024-07-12 11:56:42.209718] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:52.169 [2024-07-12 11:56:42.209850] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2178700 00:15:52.169 [2024-07-12 11:56:42.209935] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2178b90 00:15:52.169 [2024-07-12 11:56:42.209940] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2178b90 00:15:52.169 [2024-07-12 11:56:42.210056] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:52.169 BaseBdev4 00:15:52.169 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:15:52.169 11:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:15:52.169 11:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:52.169 11:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:52.169 11:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:52.169 11:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:52.169 11:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:52.169 11:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:52.429 [ 
00:15:52.429 { 00:15:52.429 "name": "BaseBdev4", 00:15:52.429 "aliases": [ 00:15:52.429 "1bb070c4-c869-47e2-b786-046c87b9a8f1" 00:15:52.429 ], 00:15:52.429 "product_name": "Malloc disk", 00:15:52.429 "block_size": 512, 00:15:52.429 "num_blocks": 65536, 00:15:52.429 "uuid": "1bb070c4-c869-47e2-b786-046c87b9a8f1", 00:15:52.429 "assigned_rate_limits": { 00:15:52.429 "rw_ios_per_sec": 0, 00:15:52.429 "rw_mbytes_per_sec": 0, 00:15:52.429 "r_mbytes_per_sec": 0, 00:15:52.429 "w_mbytes_per_sec": 0 00:15:52.429 }, 00:15:52.429 "claimed": true, 00:15:52.429 "claim_type": "exclusive_write", 00:15:52.429 "zoned": false, 00:15:52.429 "supported_io_types": { 00:15:52.429 "read": true, 00:15:52.429 "write": true, 00:15:52.429 "unmap": true, 00:15:52.429 "flush": true, 00:15:52.429 "reset": true, 00:15:52.429 "nvme_admin": false, 00:15:52.429 "nvme_io": false, 00:15:52.429 "nvme_io_md": false, 00:15:52.429 "write_zeroes": true, 00:15:52.429 "zcopy": true, 00:15:52.429 "get_zone_info": false, 00:15:52.429 "zone_management": false, 00:15:52.429 "zone_append": false, 00:15:52.429 "compare": false, 00:15:52.429 "compare_and_write": false, 00:15:52.429 "abort": true, 00:15:52.429 "seek_hole": false, 00:15:52.429 "seek_data": false, 00:15:52.429 "copy": true, 00:15:52.429 "nvme_iov_md": false 00:15:52.429 }, 00:15:52.429 "memory_domains": [ 00:15:52.429 { 00:15:52.429 "dma_device_id": "system", 00:15:52.429 "dma_device_type": 1 00:15:52.429 }, 00:15:52.429 { 00:15:52.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.429 "dma_device_type": 2 00:15:52.429 } 00:15:52.429 ], 00:15:52.429 "driver_specific": {} 00:15:52.429 } 00:15:52.429 ] 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:52.429 11:56:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.429 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:52.687 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.688 "name": "Existed_Raid", 00:15:52.688 "uuid": "be19e744-5e6b-4c4b-a0de-7b55627890e1", 00:15:52.688 "strip_size_kb": 64, 00:15:52.688 "state": "online", 00:15:52.688 "raid_level": "concat", 00:15:52.688 "superblock": false, 00:15:52.688 "num_base_bdevs": 4, 00:15:52.688 "num_base_bdevs_discovered": 4, 00:15:52.688 "num_base_bdevs_operational": 4, 00:15:52.688 "base_bdevs_list": [ 00:15:52.688 { 00:15:52.688 "name": 
"BaseBdev1", 00:15:52.688 "uuid": "189179a0-a13c-4d9c-9c71-941574d641f5", 00:15:52.688 "is_configured": true, 00:15:52.688 "data_offset": 0, 00:15:52.688 "data_size": 65536 00:15:52.688 }, 00:15:52.688 { 00:15:52.688 "name": "BaseBdev2", 00:15:52.688 "uuid": "a6be6fac-cbe4-4d54-b87f-a6e71eff2e67", 00:15:52.688 "is_configured": true, 00:15:52.688 "data_offset": 0, 00:15:52.688 "data_size": 65536 00:15:52.688 }, 00:15:52.688 { 00:15:52.688 "name": "BaseBdev3", 00:15:52.688 "uuid": "7f53b98e-6b6b-4ac6-adc0-bdbf18736dd6", 00:15:52.688 "is_configured": true, 00:15:52.688 "data_offset": 0, 00:15:52.688 "data_size": 65536 00:15:52.688 }, 00:15:52.688 { 00:15:52.688 "name": "BaseBdev4", 00:15:52.688 "uuid": "1bb070c4-c869-47e2-b786-046c87b9a8f1", 00:15:52.688 "is_configured": true, 00:15:52.688 "data_offset": 0, 00:15:52.688 "data_size": 65536 00:15:52.688 } 00:15:52.688 ] 00:15:52.688 }' 00:15:52.688 11:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.688 11:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:53.260 11:56:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:53.260 [2024-07-12 11:56:43.384917] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:53.260 "name": "Existed_Raid", 00:15:53.260 "aliases": [ 00:15:53.260 "be19e744-5e6b-4c4b-a0de-7b55627890e1" 00:15:53.260 ], 00:15:53.260 "product_name": "Raid Volume", 00:15:53.260 "block_size": 512, 00:15:53.260 "num_blocks": 262144, 00:15:53.260 "uuid": "be19e744-5e6b-4c4b-a0de-7b55627890e1", 00:15:53.260 "assigned_rate_limits": { 00:15:53.260 "rw_ios_per_sec": 0, 00:15:53.260 "rw_mbytes_per_sec": 0, 00:15:53.260 "r_mbytes_per_sec": 0, 00:15:53.260 "w_mbytes_per_sec": 0 00:15:53.260 }, 00:15:53.260 "claimed": false, 00:15:53.260 "zoned": false, 00:15:53.260 "supported_io_types": { 00:15:53.260 "read": true, 00:15:53.260 "write": true, 00:15:53.260 "unmap": true, 00:15:53.260 "flush": true, 00:15:53.260 "reset": true, 00:15:53.260 "nvme_admin": false, 00:15:53.260 "nvme_io": false, 00:15:53.260 "nvme_io_md": false, 00:15:53.260 "write_zeroes": true, 00:15:53.260 "zcopy": false, 00:15:53.260 "get_zone_info": false, 00:15:53.260 "zone_management": false, 00:15:53.260 "zone_append": false, 00:15:53.260 "compare": false, 00:15:53.260 "compare_and_write": false, 00:15:53.260 "abort": false, 00:15:53.260 "seek_hole": false, 00:15:53.260 "seek_data": false, 00:15:53.260 "copy": false, 00:15:53.260 "nvme_iov_md": false 00:15:53.260 }, 00:15:53.260 "memory_domains": [ 00:15:53.260 { 00:15:53.260 "dma_device_id": "system", 00:15:53.260 "dma_device_type": 1 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.260 "dma_device_type": 2 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "dma_device_id": "system", 00:15:53.260 "dma_device_type": 1 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:15:53.260 "dma_device_type": 2 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "dma_device_id": "system", 00:15:53.260 "dma_device_type": 1 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.260 "dma_device_type": 2 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "dma_device_id": "system", 00:15:53.260 "dma_device_type": 1 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.260 "dma_device_type": 2 00:15:53.260 } 00:15:53.260 ], 00:15:53.260 "driver_specific": { 00:15:53.260 "raid": { 00:15:53.260 "uuid": "be19e744-5e6b-4c4b-a0de-7b55627890e1", 00:15:53.260 "strip_size_kb": 64, 00:15:53.260 "state": "online", 00:15:53.260 "raid_level": "concat", 00:15:53.260 "superblock": false, 00:15:53.260 "num_base_bdevs": 4, 00:15:53.260 "num_base_bdevs_discovered": 4, 00:15:53.260 "num_base_bdevs_operational": 4, 00:15:53.260 "base_bdevs_list": [ 00:15:53.260 { 00:15:53.260 "name": "BaseBdev1", 00:15:53.260 "uuid": "189179a0-a13c-4d9c-9c71-941574d641f5", 00:15:53.260 "is_configured": true, 00:15:53.260 "data_offset": 0, 00:15:53.260 "data_size": 65536 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "name": "BaseBdev2", 00:15:53.260 "uuid": "a6be6fac-cbe4-4d54-b87f-a6e71eff2e67", 00:15:53.260 "is_configured": true, 00:15:53.260 "data_offset": 0, 00:15:53.260 "data_size": 65536 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "name": "BaseBdev3", 00:15:53.260 "uuid": "7f53b98e-6b6b-4ac6-adc0-bdbf18736dd6", 00:15:53.260 "is_configured": true, 00:15:53.260 "data_offset": 0, 00:15:53.260 "data_size": 65536 00:15:53.260 }, 00:15:53.260 { 00:15:53.260 "name": "BaseBdev4", 00:15:53.260 "uuid": "1bb070c4-c869-47e2-b786-046c87b9a8f1", 00:15:53.260 "is_configured": true, 00:15:53.260 "data_offset": 0, 00:15:53.260 "data_size": 65536 00:15:53.260 } 00:15:53.260 ] 00:15:53.260 } 00:15:53.260 } 00:15:53.260 }' 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:53.260 BaseBdev2 00:15:53.260 BaseBdev3 00:15:53.260 BaseBdev4' 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:53.260 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:53.519 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:53.519 "name": "BaseBdev1", 00:15:53.519 "aliases": [ 00:15:53.519 "189179a0-a13c-4d9c-9c71-941574d641f5" 00:15:53.519 ], 00:15:53.519 "product_name": "Malloc disk", 00:15:53.519 "block_size": 512, 00:15:53.519 "num_blocks": 65536, 00:15:53.519 "uuid": "189179a0-a13c-4d9c-9c71-941574d641f5", 00:15:53.519 "assigned_rate_limits": { 00:15:53.519 "rw_ios_per_sec": 0, 00:15:53.519 "rw_mbytes_per_sec": 0, 00:15:53.519 "r_mbytes_per_sec": 0, 00:15:53.519 "w_mbytes_per_sec": 0 00:15:53.519 }, 00:15:53.519 "claimed": true, 00:15:53.519 "claim_type": "exclusive_write", 00:15:53.519 "zoned": false, 00:15:53.519 "supported_io_types": { 00:15:53.519 "read": true, 00:15:53.519 "write": true, 00:15:53.519 "unmap": true, 00:15:53.519 "flush": true, 00:15:53.519 "reset": true, 00:15:53.519 "nvme_admin": false, 00:15:53.519 "nvme_io": false, 00:15:53.519 "nvme_io_md": false, 00:15:53.519 "write_zeroes": true, 00:15:53.519 "zcopy": true, 00:15:53.519 "get_zone_info": false, 00:15:53.519 "zone_management": false, 00:15:53.519 "zone_append": false, 00:15:53.519 "compare": false, 00:15:53.519 "compare_and_write": false, 00:15:53.519 "abort": true, 00:15:53.519 "seek_hole": false, 00:15:53.519 "seek_data": 
false, 00:15:53.519 "copy": true, 00:15:53.519 "nvme_iov_md": false 00:15:53.519 }, 00:15:53.519 "memory_domains": [ 00:15:53.519 { 00:15:53.519 "dma_device_id": "system", 00:15:53.519 "dma_device_type": 1 00:15:53.519 }, 00:15:53.519 { 00:15:53.519 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.519 "dma_device_type": 2 00:15:53.519 } 00:15:53.519 ], 00:15:53.519 "driver_specific": {} 00:15:53.519 }' 00:15:53.519 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.519 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.519 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:53.519 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.519 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.777 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:53.777 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:53.777 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:53.777 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:53.777 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.777 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.777 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:53.777 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:53.777 11:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:53.777 11:56:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:54.036 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:54.036 "name": "BaseBdev2", 00:15:54.036 "aliases": [ 00:15:54.036 "a6be6fac-cbe4-4d54-b87f-a6e71eff2e67" 00:15:54.036 ], 00:15:54.036 "product_name": "Malloc disk", 00:15:54.036 "block_size": 512, 00:15:54.036 "num_blocks": 65536, 00:15:54.036 "uuid": "a6be6fac-cbe4-4d54-b87f-a6e71eff2e67", 00:15:54.036 "assigned_rate_limits": { 00:15:54.036 "rw_ios_per_sec": 0, 00:15:54.036 "rw_mbytes_per_sec": 0, 00:15:54.036 "r_mbytes_per_sec": 0, 00:15:54.036 "w_mbytes_per_sec": 0 00:15:54.036 }, 00:15:54.036 "claimed": true, 00:15:54.036 "claim_type": "exclusive_write", 00:15:54.036 "zoned": false, 00:15:54.036 "supported_io_types": { 00:15:54.036 "read": true, 00:15:54.036 "write": true, 00:15:54.036 "unmap": true, 00:15:54.036 "flush": true, 00:15:54.036 "reset": true, 00:15:54.036 "nvme_admin": false, 00:15:54.036 "nvme_io": false, 00:15:54.036 "nvme_io_md": false, 00:15:54.036 "write_zeroes": true, 00:15:54.036 "zcopy": true, 00:15:54.036 "get_zone_info": false, 00:15:54.036 "zone_management": false, 00:15:54.036 "zone_append": false, 00:15:54.036 "compare": false, 00:15:54.036 "compare_and_write": false, 00:15:54.036 "abort": true, 00:15:54.036 "seek_hole": false, 00:15:54.036 "seek_data": false, 00:15:54.036 "copy": true, 00:15:54.036 "nvme_iov_md": false 00:15:54.036 }, 00:15:54.036 "memory_domains": [ 00:15:54.036 { 00:15:54.036 "dma_device_id": "system", 00:15:54.036 "dma_device_type": 1 00:15:54.036 }, 00:15:54.036 { 00:15:54.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.036 "dma_device_type": 2 00:15:54.036 } 00:15:54.036 ], 00:15:54.036 "driver_specific": {} 00:15:54.036 }' 00:15:54.036 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:54.036 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:15:54.036 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:54.036 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.036 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:54.293 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:54.552 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:54.552 "name": "BaseBdev3", 00:15:54.552 "aliases": [ 00:15:54.552 "7f53b98e-6b6b-4ac6-adc0-bdbf18736dd6" 00:15:54.552 ], 00:15:54.552 "product_name": "Malloc disk", 00:15:54.552 "block_size": 512, 00:15:54.552 "num_blocks": 65536, 00:15:54.552 "uuid": "7f53b98e-6b6b-4ac6-adc0-bdbf18736dd6", 00:15:54.552 "assigned_rate_limits": { 00:15:54.552 "rw_ios_per_sec": 0, 00:15:54.552 "rw_mbytes_per_sec": 0, 00:15:54.552 "r_mbytes_per_sec": 0, 
00:15:54.552 "w_mbytes_per_sec": 0 00:15:54.552 }, 00:15:54.552 "claimed": true, 00:15:54.552 "claim_type": "exclusive_write", 00:15:54.552 "zoned": false, 00:15:54.552 "supported_io_types": { 00:15:54.552 "read": true, 00:15:54.552 "write": true, 00:15:54.552 "unmap": true, 00:15:54.552 "flush": true, 00:15:54.552 "reset": true, 00:15:54.552 "nvme_admin": false, 00:15:54.552 "nvme_io": false, 00:15:54.552 "nvme_io_md": false, 00:15:54.552 "write_zeroes": true, 00:15:54.552 "zcopy": true, 00:15:54.552 "get_zone_info": false, 00:15:54.552 "zone_management": false, 00:15:54.552 "zone_append": false, 00:15:54.552 "compare": false, 00:15:54.552 "compare_and_write": false, 00:15:54.552 "abort": true, 00:15:54.552 "seek_hole": false, 00:15:54.552 "seek_data": false, 00:15:54.552 "copy": true, 00:15:54.552 "nvme_iov_md": false 00:15:54.552 }, 00:15:54.552 "memory_domains": [ 00:15:54.552 { 00:15:54.552 "dma_device_id": "system", 00:15:54.552 "dma_device_type": 1 00:15:54.552 }, 00:15:54.552 { 00:15:54.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.552 "dma_device_type": 2 00:15:54.552 } 00:15:54.552 ], 00:15:54.552 "driver_specific": {} 00:15:54.552 }' 00:15:54.552 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:54.552 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:54.552 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:54.552 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.552 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.552 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:54.552 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.552 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:15:54.810 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.810 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.810 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.811 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.811 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:54.811 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:54.811 11:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:55.070 "name": "BaseBdev4", 00:15:55.070 "aliases": [ 00:15:55.070 "1bb070c4-c869-47e2-b786-046c87b9a8f1" 00:15:55.070 ], 00:15:55.070 "product_name": "Malloc disk", 00:15:55.070 "block_size": 512, 00:15:55.070 "num_blocks": 65536, 00:15:55.070 "uuid": "1bb070c4-c869-47e2-b786-046c87b9a8f1", 00:15:55.070 "assigned_rate_limits": { 00:15:55.070 "rw_ios_per_sec": 0, 00:15:55.070 "rw_mbytes_per_sec": 0, 00:15:55.070 "r_mbytes_per_sec": 0, 00:15:55.070 "w_mbytes_per_sec": 0 00:15:55.070 }, 00:15:55.070 "claimed": true, 00:15:55.070 "claim_type": "exclusive_write", 00:15:55.070 "zoned": false, 00:15:55.070 "supported_io_types": { 00:15:55.070 "read": true, 00:15:55.070 "write": true, 00:15:55.070 "unmap": true, 00:15:55.070 "flush": true, 00:15:55.070 "reset": true, 00:15:55.070 "nvme_admin": false, 00:15:55.070 "nvme_io": false, 00:15:55.070 "nvme_io_md": false, 00:15:55.070 "write_zeroes": true, 00:15:55.070 "zcopy": true, 00:15:55.070 "get_zone_info": false, 00:15:55.070 "zone_management": false, 00:15:55.070 "zone_append": false, 00:15:55.070 
"compare": false, 00:15:55.070 "compare_and_write": false, 00:15:55.070 "abort": true, 00:15:55.070 "seek_hole": false, 00:15:55.070 "seek_data": false, 00:15:55.070 "copy": true, 00:15:55.070 "nvme_iov_md": false 00:15:55.070 }, 00:15:55.070 "memory_domains": [ 00:15:55.070 { 00:15:55.070 "dma_device_id": "system", 00:15:55.070 "dma_device_type": 1 00:15:55.070 }, 00:15:55.070 { 00:15:55.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.070 "dma_device_type": 2 00:15:55.070 } 00:15:55.070 ], 00:15:55.070 "driver_specific": {} 00:15:55.070 }' 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:55.070 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:15:55.329 [2024-07-12 11:56:45.530444] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:15:55.329 [2024-07-12 11:56:45.530462] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:15:55.329 [2024-07-12 11:56:45.530495] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:55.329 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:15:55.587 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:15:55.587 "name": "Existed_Raid",
00:15:55.587 "uuid": "be19e744-5e6b-4c4b-a0de-7b55627890e1",
00:15:55.587 "strip_size_kb": 64,
00:15:55.587 "state": "offline",
00:15:55.587 "raid_level": "concat",
00:15:55.587 "superblock": false,
00:15:55.587 "num_base_bdevs": 4,
00:15:55.587 "num_base_bdevs_discovered": 3,
00:15:55.587 "num_base_bdevs_operational": 3,
00:15:55.587 "base_bdevs_list": [
00:15:55.587 {
00:15:55.587 "name": null,
00:15:55.587 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:55.587 "is_configured": false,
00:15:55.587 "data_offset": 0,
00:15:55.587 "data_size": 65536
00:15:55.587 },
00:15:55.587 {
00:15:55.587 "name": "BaseBdev2",
00:15:55.587 "uuid": "a6be6fac-cbe4-4d54-b87f-a6e71eff2e67",
00:15:55.587 "is_configured": true,
00:15:55.587 "data_offset": 0,
00:15:55.587 "data_size": 65536
00:15:55.587 },
00:15:55.587 {
00:15:55.587 "name": "BaseBdev3",
00:15:55.587 "uuid": "7f53b98e-6b6b-4ac6-adc0-bdbf18736dd6",
00:15:55.587 "is_configured": true,
00:15:55.587 "data_offset": 0,
00:15:55.587 "data_size": 65536
00:15:55.587 },
00:15:55.587 {
00:15:55.587 "name": "BaseBdev4",
00:15:55.587 "uuid": "1bb070c4-c869-47e2-b786-046c87b9a8f1",
00:15:55.587 "is_configured": true,
00:15:55.587 "data_offset": 0,
00:15:55.587 "data_size": 65536
00:15:55.587 }
00:15:55.587 ]
00:15:55.587 }'
00:15:55.587 11:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:15:55.587 11:56:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:15:56.155 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 ))
00:15:56.155 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:15:56.156 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:56.156 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:15:56.156 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:15:56.156 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:15:56.156 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
00:15:56.415 [2024-07-12 11:56:46.525784] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:15:56.415 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:15:56.415 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:15:56.415 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:56.415 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:15:56.674 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:15:56.674 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:15:56.674 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3
00:15:56.674 [2024-07-12 11:56:46.868570] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:15:56.674 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:15:56.674 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:15:56.674 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:56.674 11:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:15:56.932 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:15:56.932 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:15:56.932 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4
00:15:57.190 [2024-07-12 11:56:47.203292] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4
00:15:57.190 [2024-07-12 11:56:47.203322] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2178b90 name Existed_Raid, state offline
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']'
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 ))
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:15:57.190 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:15:57.448 BaseBdev2
00:15:57.448 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2
00:15:57.448 11:56:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:15:57.448 11:56:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:15:57.448 11:56:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:15:57.448 11:56:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:15:57.448 11:56:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:15:57.448 11:56:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:15:57.706 11:56:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:15:57.706 [
00:15:57.706 {
00:15:57.706 "name": "BaseBdev2",
00:15:57.706 "aliases": [
00:15:57.706 "517f084c-ec52-46c1-87bb-ce2f069013f4"
00:15:57.706 ],
00:15:57.706 "product_name": "Malloc disk",
00:15:57.706 "block_size": 512,
00:15:57.706 "num_blocks": 65536,
00:15:57.706 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4",
00:15:57.706 "assigned_rate_limits": {
00:15:57.706 "rw_ios_per_sec": 0,
00:15:57.706 "rw_mbytes_per_sec": 0,
00:15:57.706 "r_mbytes_per_sec": 0,
00:15:57.706 "w_mbytes_per_sec": 0
00:15:57.706 },
00:15:57.706 "claimed": false,
00:15:57.706 "zoned": false,
00:15:57.706 "supported_io_types": {
00:15:57.706 "read": true,
00:15:57.706 "write": true,
00:15:57.706 "unmap": true,
00:15:57.707 "flush": true,
00:15:57.707 "reset": true,
00:15:57.707 "nvme_admin": false,
00:15:57.707 "nvme_io": false,
00:15:57.707 "nvme_io_md": false,
00:15:57.707 "write_zeroes": true,
00:15:57.707 "zcopy": true,
00:15:57.707 "get_zone_info": false,
00:15:57.707 "zone_management": false,
00:15:57.707 "zone_append": false,
00:15:57.707 "compare": false,
00:15:57.707 "compare_and_write": false,
00:15:57.707 "abort": true,
00:15:57.707 "seek_hole": false,
00:15:57.707 "seek_data": false,
00:15:57.707 "copy": true,
00:15:57.707 "nvme_iov_md": false
00:15:57.707 },
00:15:57.707 "memory_domains": [
00:15:57.707 {
00:15:57.707 "dma_device_id": "system",
00:15:57.707 "dma_device_type": 1
00:15:57.707 },
00:15:57.707 {
00:15:57.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:57.707 "dma_device_type": 2
00:15:57.707 }
00:15:57.707 ],
00:15:57.707 "driver_specific": {}
00:15:57.707 }
00:15:57.707 ]
00:15:57.707 11:56:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:15:57.707 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:15:57.707 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:15:57.707 11:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:15:57.965 BaseBdev3
00:15:57.965 11:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3
00:15:57.965 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:15:57.965 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:15:57.965 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:15:57.965 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:15:57.965 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:15:57.965 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:15:57.965 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:15:58.223 [
00:15:58.223 {
00:15:58.223 "name": "BaseBdev3",
00:15:58.223 "aliases": [
00:15:58.223 "ce7ffb8a-9512-472d-b602-2785deb20da3"
00:15:58.223 ],
00:15:58.223 "product_name": "Malloc disk",
00:15:58.223 "block_size": 512,
00:15:58.223 "num_blocks": 65536,
00:15:58.223 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3",
00:15:58.223 "assigned_rate_limits": {
00:15:58.223 "rw_ios_per_sec": 0,
00:15:58.223 "rw_mbytes_per_sec": 0,
00:15:58.223 "r_mbytes_per_sec": 0,
00:15:58.223 "w_mbytes_per_sec": 0
00:15:58.223 },
00:15:58.223 "claimed": false,
00:15:58.223 "zoned": false,
00:15:58.223 "supported_io_types": {
00:15:58.223 "read": true,
00:15:58.223 "write": true,
00:15:58.223 "unmap": true,
00:15:58.223 "flush": true,
00:15:58.223 "reset": true,
00:15:58.223 "nvme_admin": false,
00:15:58.223 "nvme_io": false,
00:15:58.223 "nvme_io_md": false,
00:15:58.223 "write_zeroes": true,
00:15:58.223 "zcopy": true,
00:15:58.223 "get_zone_info": false,
00:15:58.223 "zone_management": false,
00:15:58.223 "zone_append": false,
00:15:58.223 "compare": false,
00:15:58.223 "compare_and_write": false,
00:15:58.223 "abort": true,
00:15:58.223 "seek_hole": false,
00:15:58.223 "seek_data": false,
00:15:58.223 "copy": true,
00:15:58.223 "nvme_iov_md": false
00:15:58.223 },
00:15:58.223 "memory_domains": [
00:15:58.223 {
00:15:58.223 "dma_device_id": "system",
00:15:58.223 "dma_device_type": 1
00:15:58.223 },
00:15:58.223 {
00:15:58.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:58.223 "dma_device_type": 2
00:15:58.223 }
00:15:58.223 ],
00:15:58.223 "driver_specific": {}
00:15:58.223 }
00:15:58.223 ]
00:15:58.223 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:15:58.223 11:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:15:58.223 11:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:15:58.223 11:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
00:15:58.481 BaseBdev4
00:15:58.481 11:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4
00:15:58.481 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4
00:15:58.481 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:15:58.481 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:15:58.481 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:15:58.481 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:15:58.481 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:15:58.481 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
00:15:58.739 [
00:15:58.739 {
00:15:58.739 "name": "BaseBdev4",
00:15:58.739 "aliases": [
00:15:58.739 "42dda1de-248a-4a0d-b76f-1b5761b91225"
00:15:58.739 ],
00:15:58.739 "product_name": "Malloc disk",
00:15:58.739 "block_size": 512,
00:15:58.739 "num_blocks": 65536,
00:15:58.739 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225",
00:15:58.739 "assigned_rate_limits": {
00:15:58.739 "rw_ios_per_sec": 0,
00:15:58.739 "rw_mbytes_per_sec": 0,
00:15:58.739 "r_mbytes_per_sec": 0,
00:15:58.739 "w_mbytes_per_sec": 0
00:15:58.739 },
00:15:58.739 "claimed": false,
00:15:58.739 "zoned": false,
00:15:58.739 "supported_io_types": {
00:15:58.739 "read": true,
00:15:58.739 "write": true,
00:15:58.739 "unmap": true,
00:15:58.739 "flush": true,
00:15:58.739 "reset": true,
00:15:58.739 "nvme_admin": false,
00:15:58.739 "nvme_io": false,
00:15:58.739 "nvme_io_md": false,
00:15:58.739 "write_zeroes": true,
00:15:58.739 "zcopy": true,
00:15:58.739 "get_zone_info": false,
00:15:58.739 "zone_management": false,
00:15:58.739 "zone_append": false,
00:15:58.739 "compare": false,
00:15:58.739 "compare_and_write": false,
00:15:58.739 "abort": true,
00:15:58.739 "seek_hole": false,
00:15:58.739 "seek_data": false,
00:15:58.739 "copy": true,
00:15:58.739 "nvme_iov_md": false
00:15:58.739 },
00:15:58.739 "memory_domains": [
00:15:58.739 {
00:15:58.739 "dma_device_id": "system",
00:15:58.739 "dma_device_type": 1
00:15:58.739 },
00:15:58.739 {
00:15:58.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:58.739 "dma_device_type": 2
00:15:58.739 }
00:15:58.739 ],
00:15:58.739 "driver_specific": {}
00:15:58.739 }
00:15:58.739 ]
00:15:58.739 11:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:15:58.739 11:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:15:58.739 11:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:15:58.739 11:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:15:58.998 [2024-07-12 11:56:49.020864] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:15:58.998 [2024-07-12 11:56:49.020891] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:15:58.998 [2024-07-12 11:56:49.020902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:15:58.998 [2024-07-12 11:56:49.021891] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:15:58.998 [2024-07-12 11:56:49.021921] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:15:58.998 "name": "Existed_Raid",
00:15:58.998 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:58.998 "strip_size_kb": 64,
00:15:58.998 "state": "configuring",
00:15:58.998 "raid_level": "concat",
00:15:58.998 "superblock": false,
00:15:58.998 "num_base_bdevs": 4,
00:15:58.998 "num_base_bdevs_discovered": 3,
00:15:58.998 "num_base_bdevs_operational": 4,
00:15:58.998 "base_bdevs_list": [
00:15:58.998 {
00:15:58.998 "name": "BaseBdev1",
00:15:58.998 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:58.998 "is_configured": false,
00:15:58.998 "data_offset": 0,
00:15:58.998 "data_size": 0
00:15:58.998 },
00:15:58.998 {
00:15:58.998 "name": "BaseBdev2",
00:15:58.998 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4",
00:15:58.998 "is_configured": true,
00:15:58.998 "data_offset": 0,
00:15:58.998 "data_size": 65536
00:15:58.998 },
00:15:58.998 {
00:15:58.998 "name": "BaseBdev3",
00:15:58.998 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3",
00:15:58.998 "is_configured": true,
00:15:58.998 "data_offset": 0,
00:15:58.998 "data_size": 65536
00:15:58.998 },
00:15:58.998 {
00:15:58.998 "name": "BaseBdev4",
00:15:58.998 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225",
00:15:58.998 "is_configured": true,
00:15:58.998 "data_offset": 0,
00:15:58.998 "data_size": 65536
00:15:58.998 }
00:15:58.998 ]
00:15:58.998 }'
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:15:58.998 11:56:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:15:59.565 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:15:59.824 [2024-07-12 11:56:49.846985] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:59.824 11:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:15:59.824 11:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:15:59.824 "name": "Existed_Raid",
00:15:59.824 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:59.824 "strip_size_kb": 64,
00:15:59.824 "state": "configuring",
00:15:59.824 "raid_level": "concat",
00:15:59.824 "superblock": false,
00:15:59.824 "num_base_bdevs": 4,
00:15:59.824 "num_base_bdevs_discovered": 2,
00:15:59.824 "num_base_bdevs_operational": 4,
00:15:59.824 "base_bdevs_list": [
00:15:59.824 {
00:15:59.824 "name": "BaseBdev1",
00:15:59.824 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:59.824 "is_configured": false,
00:15:59.824 "data_offset": 0,
00:15:59.824 "data_size": 0
00:15:59.824 },
00:15:59.824 {
00:15:59.824 "name": null,
00:15:59.824 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4",
00:15:59.824 "is_configured": false,
00:15:59.824 "data_offset": 0,
00:15:59.824 "data_size": 65536
00:15:59.824 },
00:15:59.824 {
00:15:59.824 "name": "BaseBdev3",
00:15:59.824 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3",
00:15:59.824 "is_configured": true,
00:15:59.824 "data_offset": 0,
00:15:59.824 "data_size": 65536
00:15:59.824 },
00:15:59.824 {
00:15:59.824 "name": "BaseBdev4",
00:15:59.824 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225",
00:15:59.824 "is_configured": true,
00:15:59.824 "data_offset": 0,
00:15:59.824 "data_size": 65536
00:15:59.824 }
00:15:59.824 ]
00:15:59.824 }'
00:15:59.824 11:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:15:59.824 11:56:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:00.391 11:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:00.391 11:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:16:00.649 11:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:16:00.649 11:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:16:00.649 [2024-07-12 11:56:50.856435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:16:00.649 BaseBdev1
00:16:00.649 11:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:16:00.649 11:56:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:16:00.649 11:56:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:16:00.649 11:56:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:16:00.649 11:56:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:16:00.649 11:56:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:16:00.649 11:56:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:16:00.908 11:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:16:01.167 [
00:16:01.167 {
00:16:01.167 "name": "BaseBdev1",
00:16:01.167 "aliases": [
00:16:01.167 "97453b94-dd68-4af0-aab5-ac4904809b4a"
00:16:01.167 ],
00:16:01.167 "product_name": "Malloc disk",
00:16:01.167 "block_size": 512,
00:16:01.167 "num_blocks": 65536,
00:16:01.167 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a",
00:16:01.167 "assigned_rate_limits": {
00:16:01.167 "rw_ios_per_sec": 0,
00:16:01.167 "rw_mbytes_per_sec": 0,
00:16:01.167 "r_mbytes_per_sec": 0,
00:16:01.167 "w_mbytes_per_sec": 0
00:16:01.167 },
00:16:01.167 "claimed": true,
00:16:01.167 "claim_type": "exclusive_write",
00:16:01.167 "zoned": false,
00:16:01.167 "supported_io_types": {
00:16:01.167 "read": true,
00:16:01.167 "write": true,
00:16:01.167 "unmap": true,
00:16:01.167 "flush": true,
00:16:01.167 "reset": true,
00:16:01.167 "nvme_admin": false,
00:16:01.167 "nvme_io": false,
00:16:01.167 "nvme_io_md": false,
00:16:01.167 "write_zeroes": true,
00:16:01.167 "zcopy": true,
00:16:01.167 "get_zone_info": false,
00:16:01.167 "zone_management": false,
00:16:01.167 "zone_append": false,
00:16:01.167 "compare": false,
00:16:01.167 "compare_and_write": false,
00:16:01.167 "abort": true,
00:16:01.167 "seek_hole": false,
00:16:01.167 "seek_data": false,
00:16:01.167 "copy": true,
00:16:01.167 "nvme_iov_md": false
00:16:01.167 },
00:16:01.167 "memory_domains": [
00:16:01.167 {
00:16:01.167 "dma_device_id": "system",
00:16:01.167 "dma_device_type": 1
00:16:01.167 },
00:16:01.167 {
00:16:01.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:01.167 "dma_device_type": 2
00:16:01.167 }
00:16:01.167 ],
00:16:01.167 "driver_specific": {}
00:16:01.167 }
00:16:01.167 ]
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:01.167 "name": "Existed_Raid",
00:16:01.167 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:01.167 "strip_size_kb": 64,
00:16:01.167 "state": "configuring",
00:16:01.167 "raid_level": "concat",
00:16:01.167 "superblock": false,
00:16:01.167 "num_base_bdevs": 4,
00:16:01.167 "num_base_bdevs_discovered": 3,
00:16:01.167 "num_base_bdevs_operational": 4,
00:16:01.167 "base_bdevs_list": [
00:16:01.167 {
00:16:01.167 "name": "BaseBdev1",
00:16:01.167 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a",
00:16:01.167 "is_configured": true,
00:16:01.167 "data_offset": 0,
00:16:01.167 "data_size": 65536
00:16:01.167 },
00:16:01.167 {
00:16:01.167 "name": null,
00:16:01.167 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4",
00:16:01.167 "is_configured": false,
00:16:01.167 "data_offset": 0,
00:16:01.167 "data_size": 65536
00:16:01.167 },
00:16:01.167 {
00:16:01.167 "name": "BaseBdev3",
00:16:01.167 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3",
00:16:01.167 "is_configured": true,
00:16:01.167 "data_offset": 0,
00:16:01.167 "data_size": 65536
00:16:01.167 },
00:16:01.167 {
00:16:01.167 "name": "BaseBdev4",
00:16:01.167 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225",
00:16:01.167 "is_configured": true,
00:16:01.167 "data_offset": 0,
00:16:01.167 "data_size": 65536
00:16:01.167 }
00:16:01.167 ]
00:16:01.167 }'
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:01.167 11:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:01.733 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:01.733 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:16:01.992 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:16:01.992 11:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:16:01.992 [2024-07-12 11:56:52.147816] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:01.992 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:02.250 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:02.250 "name": "Existed_Raid",
00:16:02.250 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:02.250 "strip_size_kb": 64,
00:16:02.250 "state": "configuring",
00:16:02.250 "raid_level": "concat",
00:16:02.250 "superblock": false,
00:16:02.250 "num_base_bdevs": 4,
00:16:02.250 "num_base_bdevs_discovered": 2,
00:16:02.250 "num_base_bdevs_operational": 4,
00:16:02.250 "base_bdevs_list": [
00:16:02.250 {
00:16:02.250 "name": "BaseBdev1",
00:16:02.250 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a",
00:16:02.250 "is_configured": true,
00:16:02.250 "data_offset": 0,
00:16:02.250 "data_size": 65536
00:16:02.250 },
00:16:02.250 {
00:16:02.250 "name": null,
00:16:02.250 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4",
00:16:02.250 "is_configured": false,
00:16:02.250 "data_offset": 0,
00:16:02.250 "data_size": 65536
00:16:02.250 },
00:16:02.250 {
00:16:02.250 "name": null,
00:16:02.250 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3",
00:16:02.250 "is_configured": false,
00:16:02.250 "data_offset": 0,
00:16:02.250 "data_size": 65536
00:16:02.250 },
00:16:02.250 {
00:16:02.250 "name": "BaseBdev4",
00:16:02.250 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225",
00:16:02.250 "is_configured": true,
00:16:02.250 "data_offset": 0,
00:16:02.250 "data_size": 65536
00:16:02.250 }
00:16:02.250 ]
00:16:02.250 }'
00:16:02.250 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:02.250 11:56:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:02.817 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:02.817 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:16:02.817 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]]
00:16:02.817 11:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
00:16:03.075 [2024-07-12 11:56:53.142394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:03.075 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:03.333 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:03.333 "name": "Existed_Raid",
00:16:03.333 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:03.333 "strip_size_kb": 64,
00:16:03.333 "state": "configuring",
00:16:03.333 "raid_level": "concat",
00:16:03.333 "superblock": false,
00:16:03.333 "num_base_bdevs": 4,
00:16:03.333 "num_base_bdevs_discovered": 3,
00:16:03.333 "num_base_bdevs_operational": 4,
00:16:03.333 "base_bdevs_list": [
00:16:03.333 {
00:16:03.333 "name": "BaseBdev1",
00:16:03.333 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a",
00:16:03.333 "is_configured": true,
00:16:03.333 "data_offset": 0,
00:16:03.333 "data_size": 65536
00:16:03.333 },
00:16:03.333 {
00:16:03.333 "name": null,
00:16:03.333 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4",
00:16:03.333 "is_configured": false,
00:16:03.333 "data_offset": 0,
00:16:03.333 "data_size": 65536
00:16:03.333 },
00:16:03.333 {
00:16:03.333 "name": "BaseBdev3",
00:16:03.333 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3",
00:16:03.333 "is_configured": true,
00:16:03.333 "data_offset": 0,
00:16:03.333 "data_size": 65536
00:16:03.333 },
00:16:03.333 {
00:16:03.333 "name": "BaseBdev4",
00:16:03.333 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225",
00:16:03.333 "is_configured": true,
00:16:03.333 "data_offset": 0,
00:16:03.333 "data_size": 65536
00:16:03.333 }
00:16:03.333 ]
00:16:03.333 }'
00:16:03.333 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:03.333 11:56:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:03.591 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:03.591 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:16:03.850 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]]
00:16:03.850 11:56:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:16:04.109 [2024-07-12 11:56:54.144990] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:16:04.109 11:56:54 bdev_raid.raid_state_function_test
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.109 "name": "Existed_Raid", 00:16:04.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.109 "strip_size_kb": 64, 00:16:04.109 "state": "configuring", 00:16:04.109 "raid_level": "concat", 00:16:04.109 "superblock": false, 00:16:04.109 "num_base_bdevs": 4, 00:16:04.109 "num_base_bdevs_discovered": 2, 00:16:04.109 "num_base_bdevs_operational": 4, 00:16:04.109 "base_bdevs_list": [ 00:16:04.109 { 00:16:04.109 "name": null, 00:16:04.109 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a", 00:16:04.109 "is_configured": false, 00:16:04.109 "data_offset": 0, 00:16:04.109 "data_size": 65536 00:16:04.109 }, 00:16:04.109 { 00:16:04.109 "name": null, 00:16:04.109 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4", 00:16:04.109 "is_configured": false, 00:16:04.109 "data_offset": 0, 00:16:04.109 "data_size": 65536 00:16:04.109 }, 00:16:04.109 { 00:16:04.109 "name": "BaseBdev3", 00:16:04.109 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3", 00:16:04.109 "is_configured": true, 00:16:04.109 "data_offset": 0, 00:16:04.109 "data_size": 65536 00:16:04.109 }, 00:16:04.109 { 00:16:04.109 "name": "BaseBdev4", 00:16:04.109 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225", 00:16:04.109 
"is_configured": true, 00:16:04.109 "data_offset": 0, 00:16:04.109 "data_size": 65536 00:16:04.109 } 00:16:04.109 ] 00:16:04.109 }' 00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.109 11:56:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.677 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:04.677 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.936 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:04.936 11:56:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:04.936 [2024-07-12 11:56:55.153206] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.936 11:56:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.936 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.195 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.195 "name": "Existed_Raid", 00:16:05.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.195 "strip_size_kb": 64, 00:16:05.195 "state": "configuring", 00:16:05.195 "raid_level": "concat", 00:16:05.195 "superblock": false, 00:16:05.195 "num_base_bdevs": 4, 00:16:05.195 "num_base_bdevs_discovered": 3, 00:16:05.195 "num_base_bdevs_operational": 4, 00:16:05.195 "base_bdevs_list": [ 00:16:05.195 { 00:16:05.195 "name": null, 00:16:05.195 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a", 00:16:05.195 "is_configured": false, 00:16:05.195 "data_offset": 0, 00:16:05.195 "data_size": 65536 00:16:05.195 }, 00:16:05.195 { 00:16:05.195 "name": "BaseBdev2", 00:16:05.195 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4", 00:16:05.195 "is_configured": true, 00:16:05.195 "data_offset": 0, 00:16:05.195 "data_size": 65536 00:16:05.195 }, 00:16:05.195 { 00:16:05.195 "name": "BaseBdev3", 00:16:05.195 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3", 00:16:05.195 "is_configured": true, 00:16:05.195 "data_offset": 0, 00:16:05.195 "data_size": 65536 00:16:05.195 }, 00:16:05.195 { 00:16:05.195 "name": "BaseBdev4", 00:16:05.195 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225", 00:16:05.195 "is_configured": true, 00:16:05.195 "data_offset": 0, 00:16:05.195 
"data_size": 65536 00:16:05.195 } 00:16:05.195 ] 00:16:05.195 }' 00:16:05.195 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.195 11:56:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:05.763 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.763 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:05.763 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:05.763 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.763 11:56:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:06.021 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 97453b94-dd68-4af0-aab5-ac4904809b4a 00:16:06.280 [2024-07-12 11:56:56.330975] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:06.280 [2024-07-12 11:56:56.331001] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x231c1e0 00:16:06.280 [2024-07-12 11:56:56.331005] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:06.280 [2024-07-12 11:56:56.331144] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2177530 00:16:06.280 [2024-07-12 11:56:56.331228] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x231c1e0 00:16:06.280 [2024-07-12 11:56:56.331233] bdev_raid.c:1725:raid_bdev_configure_cont: 
*DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x231c1e0 00:16:06.280 [2024-07-12 11:56:56.331346] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:06.280 NewBaseBdev 00:16:06.280 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:06.280 11:56:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:06.280 11:56:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:06.280 11:56:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:06.280 11:56:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:06.280 11:56:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:06.280 11:56:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:06.280 11:56:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:06.539 [ 00:16:06.539 { 00:16:06.539 "name": "NewBaseBdev", 00:16:06.539 "aliases": [ 00:16:06.539 "97453b94-dd68-4af0-aab5-ac4904809b4a" 00:16:06.539 ], 00:16:06.539 "product_name": "Malloc disk", 00:16:06.539 "block_size": 512, 00:16:06.539 "num_blocks": 65536, 00:16:06.539 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a", 00:16:06.539 "assigned_rate_limits": { 00:16:06.539 "rw_ios_per_sec": 0, 00:16:06.539 "rw_mbytes_per_sec": 0, 00:16:06.539 "r_mbytes_per_sec": 0, 00:16:06.539 "w_mbytes_per_sec": 0 00:16:06.539 }, 00:16:06.539 "claimed": true, 00:16:06.539 "claim_type": "exclusive_write", 00:16:06.539 "zoned": false, 00:16:06.539 "supported_io_types": { 00:16:06.539 
"read": true, 00:16:06.539 "write": true, 00:16:06.539 "unmap": true, 00:16:06.539 "flush": true, 00:16:06.539 "reset": true, 00:16:06.539 "nvme_admin": false, 00:16:06.539 "nvme_io": false, 00:16:06.539 "nvme_io_md": false, 00:16:06.539 "write_zeroes": true, 00:16:06.539 "zcopy": true, 00:16:06.539 "get_zone_info": false, 00:16:06.539 "zone_management": false, 00:16:06.539 "zone_append": false, 00:16:06.539 "compare": false, 00:16:06.539 "compare_and_write": false, 00:16:06.539 "abort": true, 00:16:06.539 "seek_hole": false, 00:16:06.539 "seek_data": false, 00:16:06.539 "copy": true, 00:16:06.539 "nvme_iov_md": false 00:16:06.539 }, 00:16:06.539 "memory_domains": [ 00:16:06.539 { 00:16:06.539 "dma_device_id": "system", 00:16:06.539 "dma_device_type": 1 00:16:06.539 }, 00:16:06.539 { 00:16:06.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.539 "dma_device_type": 2 00:16:06.539 } 00:16:06.539 ], 00:16:06.539 "driver_specific": {} 00:16:06.539 } 00:16:06.539 ] 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.539 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.540 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.540 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.798 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.798 "name": "Existed_Raid", 00:16:06.798 "uuid": "623e640d-2513-4000-b671-36ebc75f5469", 00:16:06.798 "strip_size_kb": 64, 00:16:06.798 "state": "online", 00:16:06.798 "raid_level": "concat", 00:16:06.798 "superblock": false, 00:16:06.798 "num_base_bdevs": 4, 00:16:06.798 "num_base_bdevs_discovered": 4, 00:16:06.798 "num_base_bdevs_operational": 4, 00:16:06.798 "base_bdevs_list": [ 00:16:06.798 { 00:16:06.798 "name": "NewBaseBdev", 00:16:06.798 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a", 00:16:06.798 "is_configured": true, 00:16:06.798 "data_offset": 0, 00:16:06.798 "data_size": 65536 00:16:06.798 }, 00:16:06.798 { 00:16:06.798 "name": "BaseBdev2", 00:16:06.798 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4", 00:16:06.798 "is_configured": true, 00:16:06.798 "data_offset": 0, 00:16:06.798 "data_size": 65536 00:16:06.798 }, 00:16:06.798 { 00:16:06.798 "name": "BaseBdev3", 00:16:06.798 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3", 00:16:06.798 "is_configured": true, 00:16:06.798 "data_offset": 0, 00:16:06.798 "data_size": 65536 00:16:06.798 }, 00:16:06.798 { 00:16:06.798 "name": "BaseBdev4", 00:16:06.798 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225", 00:16:06.798 "is_configured": true, 00:16:06.798 "data_offset": 0, 00:16:06.798 "data_size": 65536 00:16:06.798 } 00:16:06.798 ] 00:16:06.798 
}' 00:16:06.798 11:56:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.798 11:56:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.364 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:07.364 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:07.364 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:07.364 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:07.364 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:07.364 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:07.364 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:07.364 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:07.364 [2024-07-12 11:56:57.514267] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:07.364 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:07.364 "name": "Existed_Raid", 00:16:07.364 "aliases": [ 00:16:07.364 "623e640d-2513-4000-b671-36ebc75f5469" 00:16:07.364 ], 00:16:07.364 "product_name": "Raid Volume", 00:16:07.364 "block_size": 512, 00:16:07.364 "num_blocks": 262144, 00:16:07.364 "uuid": "623e640d-2513-4000-b671-36ebc75f5469", 00:16:07.364 "assigned_rate_limits": { 00:16:07.364 "rw_ios_per_sec": 0, 00:16:07.364 "rw_mbytes_per_sec": 0, 00:16:07.364 "r_mbytes_per_sec": 0, 00:16:07.364 "w_mbytes_per_sec": 0 00:16:07.364 }, 00:16:07.364 "claimed": false, 00:16:07.364 "zoned": false, 00:16:07.364 
"supported_io_types": { 00:16:07.364 "read": true, 00:16:07.364 "write": true, 00:16:07.364 "unmap": true, 00:16:07.364 "flush": true, 00:16:07.364 "reset": true, 00:16:07.364 "nvme_admin": false, 00:16:07.364 "nvme_io": false, 00:16:07.364 "nvme_io_md": false, 00:16:07.364 "write_zeroes": true, 00:16:07.364 "zcopy": false, 00:16:07.364 "get_zone_info": false, 00:16:07.364 "zone_management": false, 00:16:07.364 "zone_append": false, 00:16:07.364 "compare": false, 00:16:07.364 "compare_and_write": false, 00:16:07.364 "abort": false, 00:16:07.364 "seek_hole": false, 00:16:07.364 "seek_data": false, 00:16:07.364 "copy": false, 00:16:07.364 "nvme_iov_md": false 00:16:07.364 }, 00:16:07.364 "memory_domains": [ 00:16:07.364 { 00:16:07.364 "dma_device_id": "system", 00:16:07.364 "dma_device_type": 1 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.364 "dma_device_type": 2 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "dma_device_id": "system", 00:16:07.364 "dma_device_type": 1 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.364 "dma_device_type": 2 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "dma_device_id": "system", 00:16:07.364 "dma_device_type": 1 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.364 "dma_device_type": 2 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "dma_device_id": "system", 00:16:07.364 "dma_device_type": 1 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.364 "dma_device_type": 2 00:16:07.364 } 00:16:07.364 ], 00:16:07.364 "driver_specific": { 00:16:07.364 "raid": { 00:16:07.364 "uuid": "623e640d-2513-4000-b671-36ebc75f5469", 00:16:07.364 "strip_size_kb": 64, 00:16:07.364 "state": "online", 00:16:07.364 "raid_level": "concat", 00:16:07.364 "superblock": false, 00:16:07.364 "num_base_bdevs": 4, 00:16:07.364 "num_base_bdevs_discovered": 4, 00:16:07.364 
"num_base_bdevs_operational": 4, 00:16:07.364 "base_bdevs_list": [ 00:16:07.364 { 00:16:07.364 "name": "NewBaseBdev", 00:16:07.364 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a", 00:16:07.364 "is_configured": true, 00:16:07.364 "data_offset": 0, 00:16:07.364 "data_size": 65536 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "name": "BaseBdev2", 00:16:07.364 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4", 00:16:07.364 "is_configured": true, 00:16:07.364 "data_offset": 0, 00:16:07.364 "data_size": 65536 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "name": "BaseBdev3", 00:16:07.364 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3", 00:16:07.364 "is_configured": true, 00:16:07.364 "data_offset": 0, 00:16:07.364 "data_size": 65536 00:16:07.364 }, 00:16:07.364 { 00:16:07.364 "name": "BaseBdev4", 00:16:07.364 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225", 00:16:07.364 "is_configured": true, 00:16:07.364 "data_offset": 0, 00:16:07.364 "data_size": 65536 00:16:07.364 } 00:16:07.365 ] 00:16:07.365 } 00:16:07.365 } 00:16:07.365 }' 00:16:07.365 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:07.365 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:07.365 BaseBdev2 00:16:07.365 BaseBdev3 00:16:07.365 BaseBdev4' 00:16:07.365 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.365 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.365 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:07.623 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.623 "name": "NewBaseBdev", 00:16:07.623 "aliases": [ 00:16:07.623 
"97453b94-dd68-4af0-aab5-ac4904809b4a" 00:16:07.623 ], 00:16:07.623 "product_name": "Malloc disk", 00:16:07.623 "block_size": 512, 00:16:07.623 "num_blocks": 65536, 00:16:07.623 "uuid": "97453b94-dd68-4af0-aab5-ac4904809b4a", 00:16:07.623 "assigned_rate_limits": { 00:16:07.623 "rw_ios_per_sec": 0, 00:16:07.623 "rw_mbytes_per_sec": 0, 00:16:07.623 "r_mbytes_per_sec": 0, 00:16:07.623 "w_mbytes_per_sec": 0 00:16:07.623 }, 00:16:07.623 "claimed": true, 00:16:07.623 "claim_type": "exclusive_write", 00:16:07.623 "zoned": false, 00:16:07.623 "supported_io_types": { 00:16:07.623 "read": true, 00:16:07.623 "write": true, 00:16:07.623 "unmap": true, 00:16:07.623 "flush": true, 00:16:07.623 "reset": true, 00:16:07.623 "nvme_admin": false, 00:16:07.623 "nvme_io": false, 00:16:07.623 "nvme_io_md": false, 00:16:07.623 "write_zeroes": true, 00:16:07.623 "zcopy": true, 00:16:07.623 "get_zone_info": false, 00:16:07.623 "zone_management": false, 00:16:07.623 "zone_append": false, 00:16:07.623 "compare": false, 00:16:07.623 "compare_and_write": false, 00:16:07.623 "abort": true, 00:16:07.623 "seek_hole": false, 00:16:07.623 "seek_data": false, 00:16:07.623 "copy": true, 00:16:07.623 "nvme_iov_md": false 00:16:07.623 }, 00:16:07.623 "memory_domains": [ 00:16:07.623 { 00:16:07.623 "dma_device_id": "system", 00:16:07.623 "dma_device_type": 1 00:16:07.623 }, 00:16:07.623 { 00:16:07.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.623 "dma_device_type": 2 00:16:07.623 } 00:16:07.623 ], 00:16:07.623 "driver_specific": {} 00:16:07.623 }' 00:16:07.623 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.623 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.623 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.623 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.882 11:56:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.882 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.882 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.882 11:56:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.882 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.882 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.882 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.882 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.882 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.882 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:07.882 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:08.141 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:08.141 "name": "BaseBdev2", 00:16:08.141 "aliases": [ 00:16:08.141 "517f084c-ec52-46c1-87bb-ce2f069013f4" 00:16:08.141 ], 00:16:08.141 "product_name": "Malloc disk", 00:16:08.141 "block_size": 512, 00:16:08.141 "num_blocks": 65536, 00:16:08.141 "uuid": "517f084c-ec52-46c1-87bb-ce2f069013f4", 00:16:08.141 "assigned_rate_limits": { 00:16:08.141 "rw_ios_per_sec": 0, 00:16:08.141 "rw_mbytes_per_sec": 0, 00:16:08.141 "r_mbytes_per_sec": 0, 00:16:08.141 "w_mbytes_per_sec": 0 00:16:08.141 }, 00:16:08.141 "claimed": true, 00:16:08.141 "claim_type": "exclusive_write", 00:16:08.141 "zoned": false, 00:16:08.141 "supported_io_types": { 00:16:08.141 "read": true, 
00:16:08.141 "write": true, 00:16:08.141 "unmap": true, 00:16:08.141 "flush": true, 00:16:08.141 "reset": true, 00:16:08.141 "nvme_admin": false, 00:16:08.141 "nvme_io": false, 00:16:08.141 "nvme_io_md": false, 00:16:08.141 "write_zeroes": true, 00:16:08.141 "zcopy": true, 00:16:08.141 "get_zone_info": false, 00:16:08.141 "zone_management": false, 00:16:08.141 "zone_append": false, 00:16:08.141 "compare": false, 00:16:08.141 "compare_and_write": false, 00:16:08.141 "abort": true, 00:16:08.141 "seek_hole": false, 00:16:08.141 "seek_data": false, 00:16:08.141 "copy": true, 00:16:08.141 "nvme_iov_md": false 00:16:08.141 }, 00:16:08.141 "memory_domains": [ 00:16:08.141 { 00:16:08.141 "dma_device_id": "system", 00:16:08.141 "dma_device_type": 1 00:16:08.141 }, 00:16:08.141 { 00:16:08.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.141 "dma_device_type": 2 00:16:08.141 } 00:16:08.141 ], 00:16:08.141 "driver_specific": {} 00:16:08.141 }' 00:16:08.141 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.141 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.141 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:08.141 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.141 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.400 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:08.400 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.400 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.400 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:08.400 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.400 
11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.400 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:08.400 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:08.400 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:08.400 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:08.659 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:08.659 "name": "BaseBdev3", 00:16:08.659 "aliases": [ 00:16:08.659 "ce7ffb8a-9512-472d-b602-2785deb20da3" 00:16:08.659 ], 00:16:08.659 "product_name": "Malloc disk", 00:16:08.659 "block_size": 512, 00:16:08.659 "num_blocks": 65536, 00:16:08.659 "uuid": "ce7ffb8a-9512-472d-b602-2785deb20da3", 00:16:08.659 "assigned_rate_limits": { 00:16:08.659 "rw_ios_per_sec": 0, 00:16:08.659 "rw_mbytes_per_sec": 0, 00:16:08.659 "r_mbytes_per_sec": 0, 00:16:08.659 "w_mbytes_per_sec": 0 00:16:08.659 }, 00:16:08.659 "claimed": true, 00:16:08.659 "claim_type": "exclusive_write", 00:16:08.659 "zoned": false, 00:16:08.659 "supported_io_types": { 00:16:08.659 "read": true, 00:16:08.659 "write": true, 00:16:08.659 "unmap": true, 00:16:08.659 "flush": true, 00:16:08.659 "reset": true, 00:16:08.659 "nvme_admin": false, 00:16:08.659 "nvme_io": false, 00:16:08.659 "nvme_io_md": false, 00:16:08.659 "write_zeroes": true, 00:16:08.659 "zcopy": true, 00:16:08.659 "get_zone_info": false, 00:16:08.659 "zone_management": false, 00:16:08.659 "zone_append": false, 00:16:08.659 "compare": false, 00:16:08.659 "compare_and_write": false, 00:16:08.659 "abort": true, 00:16:08.659 "seek_hole": false, 00:16:08.659 "seek_data": false, 00:16:08.659 "copy": true, 00:16:08.659 "nvme_iov_md": false 
00:16:08.659 }, 00:16:08.659 "memory_domains": [ 00:16:08.659 { 00:16:08.659 "dma_device_id": "system", 00:16:08.659 "dma_device_type": 1 00:16:08.659 }, 00:16:08.659 { 00:16:08.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.659 "dma_device_type": 2 00:16:08.659 } 00:16:08.659 ], 00:16:08.659 "driver_specific": {} 00:16:08.659 }' 00:16:08.659 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.659 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.659 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:08.659 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.659 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.659 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:08.659 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.916 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.916 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:08.916 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.916 11:56:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.916 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:08.916 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:08.916 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:08.916 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:16:09.174 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:09.174 "name": "BaseBdev4", 00:16:09.174 "aliases": [ 00:16:09.174 "42dda1de-248a-4a0d-b76f-1b5761b91225" 00:16:09.174 ], 00:16:09.174 "product_name": "Malloc disk", 00:16:09.174 "block_size": 512, 00:16:09.174 "num_blocks": 65536, 00:16:09.174 "uuid": "42dda1de-248a-4a0d-b76f-1b5761b91225", 00:16:09.174 "assigned_rate_limits": { 00:16:09.174 "rw_ios_per_sec": 0, 00:16:09.174 "rw_mbytes_per_sec": 0, 00:16:09.174 "r_mbytes_per_sec": 0, 00:16:09.174 "w_mbytes_per_sec": 0 00:16:09.174 }, 00:16:09.174 "claimed": true, 00:16:09.174 "claim_type": "exclusive_write", 00:16:09.174 "zoned": false, 00:16:09.174 "supported_io_types": { 00:16:09.174 "read": true, 00:16:09.174 "write": true, 00:16:09.174 "unmap": true, 00:16:09.174 "flush": true, 00:16:09.174 "reset": true, 00:16:09.174 "nvme_admin": false, 00:16:09.174 "nvme_io": false, 00:16:09.174 "nvme_io_md": false, 00:16:09.174 "write_zeroes": true, 00:16:09.174 "zcopy": true, 00:16:09.174 "get_zone_info": false, 00:16:09.174 "zone_management": false, 00:16:09.174 "zone_append": false, 00:16:09.174 "compare": false, 00:16:09.174 "compare_and_write": false, 00:16:09.174 "abort": true, 00:16:09.174 "seek_hole": false, 00:16:09.174 "seek_data": false, 00:16:09.174 "copy": true, 00:16:09.174 "nvme_iov_md": false 00:16:09.174 }, 00:16:09.174 "memory_domains": [ 00:16:09.174 { 00:16:09.174 "dma_device_id": "system", 00:16:09.174 "dma_device_type": 1 00:16:09.174 }, 00:16:09.174 { 00:16:09.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.174 "dma_device_type": 2 00:16:09.174 } 00:16:09.174 ], 00:16:09.174 "driver_specific": {} 00:16:09.174 }' 00:16:09.174 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.174 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.174 11:56:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:09.174 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.174 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.174 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:09.174 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:09.174 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:09.432 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:09.432 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:09.432 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:09.432 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:09.432 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:09.432 [2024-07-12 11:56:59.655604] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:09.432 [2024-07-12 11:56:59.655623] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:09.432 [2024-07-12 11:56:59.655659] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:09.432 [2024-07-12 11:56:59.655700] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:09.432 [2024-07-12 11:56:59.655706] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x231c1e0 name Existed_Raid, state offline 00:16:09.432 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 663154 00:16:09.432 11:56:59 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 663154 ']' 00:16:09.432 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 663154 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 663154 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 663154' 00:16:09.689 killing process with pid 663154 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 663154 00:16:09.689 [2024-07-12 11:56:59.712158] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 663154 00:16:09.689 [2024-07-12 11:56:59.743994] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:09.689 00:16:09.689 real 0m24.277s 00:16:09.689 user 0m45.313s 00:16:09.689 sys 0m3.613s 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:09.689 11:56:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.689 ************************************ 00:16:09.689 END TEST raid_state_function_test 00:16:09.689 ************************************ 00:16:09.947 11:56:59 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:16:09.947 11:56:59 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:16:09.948 11:56:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:09.948 11:56:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:09.948 11:56:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:09.948 ************************************ 00:16:09.948 START TEST raid_state_function_test_sb 00:16:09.948 ************************************ 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- 
# (( i <= num_base_bdevs )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:09.948 11:56:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=667824 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 667824' 00:16:09.948 Process raid pid: 667824 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 667824 /var/tmp/spdk-raid.sock 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 667824 ']' 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:09.948 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:09.948 11:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:09.948 [2024-07-12 11:57:00.039536] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:16:09.948 [2024-07-12 11:57:00.039575] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:09.948 [2024-07-12 11:57:00.104003] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:09.948 [2024-07-12 11:57:00.176673] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:10.205 [2024-07-12 11:57:00.230523] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:10.206 [2024-07-12 11:57:00.230548] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:10.769 [2024-07-12 11:57:00.977530] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:10.769 [2024-07-12 11:57:00.977559] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:10.769 [2024-07-12 11:57:00.977565] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:10.769 [2024-07-12 11:57:00.977571] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:10.769 [2024-07-12 11:57:00.977575] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:10.769 [2024-07-12 11:57:00.977580] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:10.769 
[2024-07-12 11:57:00.977584] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:10.769 [2024-07-12 11:57:00.977589] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.769 11:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.053 11:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.053 "name": "Existed_Raid", 00:16:11.053 "uuid": "6bd1121a-ba2b-48fa-adc4-49875f4fe98b", 00:16:11.053 
"strip_size_kb": 64, 00:16:11.053 "state": "configuring", 00:16:11.053 "raid_level": "concat", 00:16:11.053 "superblock": true, 00:16:11.053 "num_base_bdevs": 4, 00:16:11.053 "num_base_bdevs_discovered": 0, 00:16:11.053 "num_base_bdevs_operational": 4, 00:16:11.053 "base_bdevs_list": [ 00:16:11.053 { 00:16:11.053 "name": "BaseBdev1", 00:16:11.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.053 "is_configured": false, 00:16:11.053 "data_offset": 0, 00:16:11.053 "data_size": 0 00:16:11.053 }, 00:16:11.053 { 00:16:11.053 "name": "BaseBdev2", 00:16:11.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.053 "is_configured": false, 00:16:11.053 "data_offset": 0, 00:16:11.053 "data_size": 0 00:16:11.053 }, 00:16:11.053 { 00:16:11.053 "name": "BaseBdev3", 00:16:11.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.053 "is_configured": false, 00:16:11.053 "data_offset": 0, 00:16:11.053 "data_size": 0 00:16:11.053 }, 00:16:11.053 { 00:16:11.053 "name": "BaseBdev4", 00:16:11.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.053 "is_configured": false, 00:16:11.053 "data_offset": 0, 00:16:11.053 "data_size": 0 00:16:11.053 } 00:16:11.053 ] 00:16:11.053 }' 00:16:11.053 11:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.053 11:57:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:11.628 11:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:11.628 [2024-07-12 11:57:01.827636] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:11.628 [2024-07-12 11:57:01.827657] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a91f0 name Existed_Raid, state configuring 00:16:11.628 11:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:11.887 [2024-07-12 11:57:02.012132] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:11.887 [2024-07-12 11:57:02.012151] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:11.887 [2024-07-12 11:57:02.012156] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:11.887 [2024-07-12 11:57:02.012161] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:11.887 [2024-07-12 11:57:02.012166] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:11.887 [2024-07-12 11:57:02.012171] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:11.887 [2024-07-12 11:57:02.012175] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:11.887 [2024-07-12 11:57:02.012180] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:11.887 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:12.144 [2024-07-12 11:57:02.192793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:12.144 BaseBdev1 00:16:12.144 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:12.144 11:57:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:12.144 11:57:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:12.144 11:57:02 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:12.144 11:57:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:12.144 11:57:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:12.144 11:57:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:12.144 11:57:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:12.402 [ 00:16:12.402 { 00:16:12.402 "name": "BaseBdev1", 00:16:12.402 "aliases": [ 00:16:12.402 "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe" 00:16:12.402 ], 00:16:12.402 "product_name": "Malloc disk", 00:16:12.402 "block_size": 512, 00:16:12.402 "num_blocks": 65536, 00:16:12.402 "uuid": "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe", 00:16:12.402 "assigned_rate_limits": { 00:16:12.402 "rw_ios_per_sec": 0, 00:16:12.402 "rw_mbytes_per_sec": 0, 00:16:12.402 "r_mbytes_per_sec": 0, 00:16:12.402 "w_mbytes_per_sec": 0 00:16:12.402 }, 00:16:12.402 "claimed": true, 00:16:12.402 "claim_type": "exclusive_write", 00:16:12.402 "zoned": false, 00:16:12.402 "supported_io_types": { 00:16:12.402 "read": true, 00:16:12.402 "write": true, 00:16:12.402 "unmap": true, 00:16:12.402 "flush": true, 00:16:12.402 "reset": true, 00:16:12.402 "nvme_admin": false, 00:16:12.402 "nvme_io": false, 00:16:12.402 "nvme_io_md": false, 00:16:12.402 "write_zeroes": true, 00:16:12.402 "zcopy": true, 00:16:12.402 "get_zone_info": false, 00:16:12.402 "zone_management": false, 00:16:12.402 "zone_append": false, 00:16:12.402 "compare": false, 00:16:12.402 "compare_and_write": false, 00:16:12.402 "abort": true, 00:16:12.402 "seek_hole": false, 00:16:12.402 "seek_data": false, 
00:16:12.402 "copy": true, 00:16:12.402 "nvme_iov_md": false 00:16:12.402 }, 00:16:12.402 "memory_domains": [ 00:16:12.402 { 00:16:12.402 "dma_device_id": "system", 00:16:12.402 "dma_device_type": 1 00:16:12.402 }, 00:16:12.402 { 00:16:12.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.402 "dma_device_type": 2 00:16:12.402 } 00:16:12.402 ], 00:16:12.402 "driver_specific": {} 00:16:12.402 } 00:16:12.402 ] 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.402 11:57:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.660 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.660 "name": "Existed_Raid", 00:16:12.660 "uuid": "565d94f5-b707-4f36-9d62-ce9b762b8c54", 00:16:12.661 "strip_size_kb": 64, 00:16:12.661 "state": "configuring", 00:16:12.661 "raid_level": "concat", 00:16:12.661 "superblock": true, 00:16:12.661 "num_base_bdevs": 4, 00:16:12.661 "num_base_bdevs_discovered": 1, 00:16:12.661 "num_base_bdevs_operational": 4, 00:16:12.661 "base_bdevs_list": [ 00:16:12.661 { 00:16:12.661 "name": "BaseBdev1", 00:16:12.661 "uuid": "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe", 00:16:12.661 "is_configured": true, 00:16:12.661 "data_offset": 2048, 00:16:12.661 "data_size": 63488 00:16:12.661 }, 00:16:12.661 { 00:16:12.661 "name": "BaseBdev2", 00:16:12.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.661 "is_configured": false, 00:16:12.661 "data_offset": 0, 00:16:12.661 "data_size": 0 00:16:12.661 }, 00:16:12.661 { 00:16:12.661 "name": "BaseBdev3", 00:16:12.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.661 "is_configured": false, 00:16:12.661 "data_offset": 0, 00:16:12.661 "data_size": 0 00:16:12.661 }, 00:16:12.661 { 00:16:12.661 "name": "BaseBdev4", 00:16:12.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.661 "is_configured": false, 00:16:12.661 "data_offset": 0, 00:16:12.661 "data_size": 0 00:16:12.661 } 00:16:12.661 ] 00:16:12.661 }' 00:16:12.661 11:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.661 11:57:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:13.227 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:13.227 [2024-07-12 11:57:03.359805] bdev_raid.c:2356:raid_bdev_delete: 
*DEBUG*: delete raid bdev: Existed_Raid 00:16:13.227 [2024-07-12 11:57:03.359831] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a8a60 name Existed_Raid, state configuring 00:16:13.227 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:13.485 [2024-07-12 11:57:03.536299] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:13.485 [2024-07-12 11:57:03.537401] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:13.485 [2024-07-12 11:57:03.537425] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:13.485 [2024-07-12 11:57:03.537431] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:13.485 [2024-07-12 11:57:03.537437] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:13.485 [2024-07-12 11:57:03.537442] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:13.485 [2024-07-12 11:57:03.537447] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.485 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.485 "name": "Existed_Raid", 00:16:13.485 "uuid": "a3702e19-83cd-45e2-a18c-a51e7736e85c", 00:16:13.485 "strip_size_kb": 64, 00:16:13.485 "state": "configuring", 00:16:13.485 "raid_level": "concat", 00:16:13.485 "superblock": true, 00:16:13.485 "num_base_bdevs": 4, 00:16:13.485 "num_base_bdevs_discovered": 1, 00:16:13.485 "num_base_bdevs_operational": 4, 00:16:13.485 "base_bdevs_list": [ 00:16:13.485 { 00:16:13.485 "name": "BaseBdev1", 00:16:13.485 "uuid": "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe", 00:16:13.485 "is_configured": true, 00:16:13.485 "data_offset": 2048, 00:16:13.485 "data_size": 63488 00:16:13.485 }, 00:16:13.485 { 00:16:13.485 "name": "BaseBdev2", 00:16:13.485 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.485 
"is_configured": false, 00:16:13.485 "data_offset": 0, 00:16:13.485 "data_size": 0 00:16:13.485 }, 00:16:13.485 { 00:16:13.485 "name": "BaseBdev3", 00:16:13.485 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.485 "is_configured": false, 00:16:13.485 "data_offset": 0, 00:16:13.485 "data_size": 0 00:16:13.485 }, 00:16:13.485 { 00:16:13.485 "name": "BaseBdev4", 00:16:13.485 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.485 "is_configured": false, 00:16:13.485 "data_offset": 0, 00:16:13.485 "data_size": 0 00:16:13.486 } 00:16:13.486 ] 00:16:13.486 }' 00:16:13.486 11:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.486 11:57:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:14.054 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:14.313 [2024-07-12 11:57:04.357045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:14.313 BaseBdev2 00:16:14.313 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:14.313 11:57:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:14.313 11:57:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:14.313 11:57:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:14.313 11:57:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:14.313 11:57:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:14.313 11:57:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:14.313 11:57:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:14.572 [ 00:16:14.572 { 00:16:14.572 "name": "BaseBdev2", 00:16:14.572 "aliases": [ 00:16:14.572 "299632e8-869a-484a-b6e8-ea92aa2a24cb" 00:16:14.572 ], 00:16:14.572 "product_name": "Malloc disk", 00:16:14.572 "block_size": 512, 00:16:14.572 "num_blocks": 65536, 00:16:14.572 "uuid": "299632e8-869a-484a-b6e8-ea92aa2a24cb", 00:16:14.572 "assigned_rate_limits": { 00:16:14.572 "rw_ios_per_sec": 0, 00:16:14.572 "rw_mbytes_per_sec": 0, 00:16:14.572 "r_mbytes_per_sec": 0, 00:16:14.572 "w_mbytes_per_sec": 0 00:16:14.572 }, 00:16:14.572 "claimed": true, 00:16:14.572 "claim_type": "exclusive_write", 00:16:14.572 "zoned": false, 00:16:14.572 "supported_io_types": { 00:16:14.572 "read": true, 00:16:14.572 "write": true, 00:16:14.572 "unmap": true, 00:16:14.572 "flush": true, 00:16:14.572 "reset": true, 00:16:14.572 "nvme_admin": false, 00:16:14.572 "nvme_io": false, 00:16:14.572 "nvme_io_md": false, 00:16:14.572 "write_zeroes": true, 00:16:14.572 "zcopy": true, 00:16:14.572 "get_zone_info": false, 00:16:14.572 "zone_management": false, 00:16:14.572 "zone_append": false, 00:16:14.572 "compare": false, 00:16:14.572 "compare_and_write": false, 00:16:14.572 "abort": true, 00:16:14.572 "seek_hole": false, 00:16:14.572 "seek_data": false, 00:16:14.572 "copy": true, 00:16:14.572 "nvme_iov_md": false 00:16:14.572 }, 00:16:14.572 "memory_domains": [ 00:16:14.572 { 00:16:14.572 "dma_device_id": "system", 00:16:14.572 "dma_device_type": 1 00:16:14.572 }, 00:16:14.572 { 00:16:14.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.572 "dma_device_type": 2 00:16:14.572 } 00:16:14.572 ], 00:16:14.572 "driver_specific": {} 00:16:14.572 } 00:16:14.572 ] 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@905 -- # return 0 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.572 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:14.830 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.830 "name": "Existed_Raid", 00:16:14.830 "uuid": "a3702e19-83cd-45e2-a18c-a51e7736e85c", 
00:16:14.830 "strip_size_kb": 64, 00:16:14.830 "state": "configuring", 00:16:14.830 "raid_level": "concat", 00:16:14.830 "superblock": true, 00:16:14.830 "num_base_bdevs": 4, 00:16:14.830 "num_base_bdevs_discovered": 2, 00:16:14.830 "num_base_bdevs_operational": 4, 00:16:14.830 "base_bdevs_list": [ 00:16:14.830 { 00:16:14.830 "name": "BaseBdev1", 00:16:14.830 "uuid": "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe", 00:16:14.830 "is_configured": true, 00:16:14.830 "data_offset": 2048, 00:16:14.830 "data_size": 63488 00:16:14.830 }, 00:16:14.830 { 00:16:14.830 "name": "BaseBdev2", 00:16:14.830 "uuid": "299632e8-869a-484a-b6e8-ea92aa2a24cb", 00:16:14.830 "is_configured": true, 00:16:14.830 "data_offset": 2048, 00:16:14.830 "data_size": 63488 00:16:14.830 }, 00:16:14.830 { 00:16:14.830 "name": "BaseBdev3", 00:16:14.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.830 "is_configured": false, 00:16:14.830 "data_offset": 0, 00:16:14.830 "data_size": 0 00:16:14.830 }, 00:16:14.830 { 00:16:14.830 "name": "BaseBdev4", 00:16:14.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.831 "is_configured": false, 00:16:14.831 "data_offset": 0, 00:16:14.831 "data_size": 0 00:16:14.831 } 00:16:14.831 ] 00:16:14.831 }' 00:16:14.831 11:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.831 11:57:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:15.397 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:15.397 [2024-07-12 11:57:05.562753] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:15.397 BaseBdev3 00:16:15.397 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:15.397 11:57:05 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:15.397 11:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:15.397 11:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:15.397 11:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:15.397 11:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:15.397 11:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.656 11:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:15.656 [ 00:16:15.656 { 00:16:15.656 "name": "BaseBdev3", 00:16:15.656 "aliases": [ 00:16:15.656 "e97acddd-b4c8-49fd-a314-76d5b89896c0" 00:16:15.656 ], 00:16:15.656 "product_name": "Malloc disk", 00:16:15.656 "block_size": 512, 00:16:15.656 "num_blocks": 65536, 00:16:15.656 "uuid": "e97acddd-b4c8-49fd-a314-76d5b89896c0", 00:16:15.656 "assigned_rate_limits": { 00:16:15.656 "rw_ios_per_sec": 0, 00:16:15.656 "rw_mbytes_per_sec": 0, 00:16:15.656 "r_mbytes_per_sec": 0, 00:16:15.656 "w_mbytes_per_sec": 0 00:16:15.656 }, 00:16:15.656 "claimed": true, 00:16:15.656 "claim_type": "exclusive_write", 00:16:15.656 "zoned": false, 00:16:15.656 "supported_io_types": { 00:16:15.656 "read": true, 00:16:15.656 "write": true, 00:16:15.656 "unmap": true, 00:16:15.656 "flush": true, 00:16:15.656 "reset": true, 00:16:15.656 "nvme_admin": false, 00:16:15.656 "nvme_io": false, 00:16:15.656 "nvme_io_md": false, 00:16:15.656 "write_zeroes": true, 00:16:15.656 "zcopy": true, 00:16:15.656 "get_zone_info": false, 00:16:15.656 "zone_management": false, 
00:16:15.656 "zone_append": false, 00:16:15.656 "compare": false, 00:16:15.656 "compare_and_write": false, 00:16:15.656 "abort": true, 00:16:15.656 "seek_hole": false, 00:16:15.656 "seek_data": false, 00:16:15.656 "copy": true, 00:16:15.656 "nvme_iov_md": false 00:16:15.656 }, 00:16:15.656 "memory_domains": [ 00:16:15.656 { 00:16:15.656 "dma_device_id": "system", 00:16:15.656 "dma_device_type": 1 00:16:15.656 }, 00:16:15.656 { 00:16:15.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.656 "dma_device_type": 2 00:16:15.656 } 00:16:15.656 ], 00:16:15.656 "driver_specific": {} 00:16:15.656 } 00:16:15.656 ] 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.915 11:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.915 11:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.915 "name": "Existed_Raid", 00:16:15.915 "uuid": "a3702e19-83cd-45e2-a18c-a51e7736e85c", 00:16:15.915 "strip_size_kb": 64, 00:16:15.915 "state": "configuring", 00:16:15.915 "raid_level": "concat", 00:16:15.915 "superblock": true, 00:16:15.915 "num_base_bdevs": 4, 00:16:15.915 "num_base_bdevs_discovered": 3, 00:16:15.915 "num_base_bdevs_operational": 4, 00:16:15.915 "base_bdevs_list": [ 00:16:15.915 { 00:16:15.915 "name": "BaseBdev1", 00:16:15.915 "uuid": "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe", 00:16:15.915 "is_configured": true, 00:16:15.915 "data_offset": 2048, 00:16:15.915 "data_size": 63488 00:16:15.915 }, 00:16:15.915 { 00:16:15.915 "name": "BaseBdev2", 00:16:15.915 "uuid": "299632e8-869a-484a-b6e8-ea92aa2a24cb", 00:16:15.915 "is_configured": true, 00:16:15.915 "data_offset": 2048, 00:16:15.915 "data_size": 63488 00:16:15.915 }, 00:16:15.915 { 00:16:15.915 "name": "BaseBdev3", 00:16:15.915 "uuid": "e97acddd-b4c8-49fd-a314-76d5b89896c0", 00:16:15.915 "is_configured": true, 00:16:15.915 "data_offset": 2048, 00:16:15.915 "data_size": 63488 00:16:15.915 }, 00:16:15.915 { 00:16:15.915 "name": "BaseBdev4", 00:16:15.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.915 "is_configured": false, 00:16:15.915 "data_offset": 0, 00:16:15.915 "data_size": 0 00:16:15.915 } 00:16:15.915 ] 00:16:15.915 }' 00:16:15.915 11:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 
-- # xtrace_disable 00:16:15.915 11:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:16.482 11:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:16.741 [2024-07-12 11:57:06.752576] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:16.741 [2024-07-12 11:57:06.752700] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15a9b90 00:16:16.741 [2024-07-12 11:57:06.752709] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:16.741 [2024-07-12 11:57:06.752826] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a9700 00:16:16.741 [2024-07-12 11:57:06.752910] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15a9b90 00:16:16.741 [2024-07-12 11:57:06.752916] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15a9b90 00:16:16.741 [2024-07-12 11:57:06.752979] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:16.741 BaseBdev4 00:16:16.741 11:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:16.741 11:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:16.741 11:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:16.741 11:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:16.741 11:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:16.741 11:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:16.741 11:57:06 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:16.741 11:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:17.000 [ 00:16:17.000 { 00:16:17.000 "name": "BaseBdev4", 00:16:17.000 "aliases": [ 00:16:17.000 "14b8f5c4-02bd-468d-94a6-284566ac1a05" 00:16:17.000 ], 00:16:17.000 "product_name": "Malloc disk", 00:16:17.000 "block_size": 512, 00:16:17.000 "num_blocks": 65536, 00:16:17.000 "uuid": "14b8f5c4-02bd-468d-94a6-284566ac1a05", 00:16:17.000 "assigned_rate_limits": { 00:16:17.000 "rw_ios_per_sec": 0, 00:16:17.000 "rw_mbytes_per_sec": 0, 00:16:17.000 "r_mbytes_per_sec": 0, 00:16:17.000 "w_mbytes_per_sec": 0 00:16:17.000 }, 00:16:17.000 "claimed": true, 00:16:17.000 "claim_type": "exclusive_write", 00:16:17.000 "zoned": false, 00:16:17.000 "supported_io_types": { 00:16:17.000 "read": true, 00:16:17.000 "write": true, 00:16:17.000 "unmap": true, 00:16:17.000 "flush": true, 00:16:17.000 "reset": true, 00:16:17.000 "nvme_admin": false, 00:16:17.000 "nvme_io": false, 00:16:17.000 "nvme_io_md": false, 00:16:17.000 "write_zeroes": true, 00:16:17.000 "zcopy": true, 00:16:17.000 "get_zone_info": false, 00:16:17.000 "zone_management": false, 00:16:17.000 "zone_append": false, 00:16:17.000 "compare": false, 00:16:17.000 "compare_and_write": false, 00:16:17.000 "abort": true, 00:16:17.000 "seek_hole": false, 00:16:17.000 "seek_data": false, 00:16:17.000 "copy": true, 00:16:17.000 "nvme_iov_md": false 00:16:17.000 }, 00:16:17.000 "memory_domains": [ 00:16:17.000 { 00:16:17.000 "dma_device_id": "system", 00:16:17.000 "dma_device_type": 1 00:16:17.000 }, 00:16:17.000 { 00:16:17.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.000 "dma_device_type": 2 00:16:17.000 } 00:16:17.000 ], 00:16:17.000 "driver_specific": 
{} 00:16:17.000 } 00:16:17.000 ] 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.000 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.259 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.259 
"name": "Existed_Raid", 00:16:17.259 "uuid": "a3702e19-83cd-45e2-a18c-a51e7736e85c", 00:16:17.259 "strip_size_kb": 64, 00:16:17.259 "state": "online", 00:16:17.259 "raid_level": "concat", 00:16:17.259 "superblock": true, 00:16:17.259 "num_base_bdevs": 4, 00:16:17.259 "num_base_bdevs_discovered": 4, 00:16:17.259 "num_base_bdevs_operational": 4, 00:16:17.259 "base_bdevs_list": [ 00:16:17.259 { 00:16:17.259 "name": "BaseBdev1", 00:16:17.259 "uuid": "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe", 00:16:17.259 "is_configured": true, 00:16:17.259 "data_offset": 2048, 00:16:17.259 "data_size": 63488 00:16:17.259 }, 00:16:17.259 { 00:16:17.259 "name": "BaseBdev2", 00:16:17.259 "uuid": "299632e8-869a-484a-b6e8-ea92aa2a24cb", 00:16:17.259 "is_configured": true, 00:16:17.259 "data_offset": 2048, 00:16:17.259 "data_size": 63488 00:16:17.259 }, 00:16:17.259 { 00:16:17.259 "name": "BaseBdev3", 00:16:17.260 "uuid": "e97acddd-b4c8-49fd-a314-76d5b89896c0", 00:16:17.260 "is_configured": true, 00:16:17.260 "data_offset": 2048, 00:16:17.260 "data_size": 63488 00:16:17.260 }, 00:16:17.260 { 00:16:17.260 "name": "BaseBdev4", 00:16:17.260 "uuid": "14b8f5c4-02bd-468d-94a6-284566ac1a05", 00:16:17.260 "is_configured": true, 00:16:17.260 "data_offset": 2048, 00:16:17.260 "data_size": 63488 00:16:17.260 } 00:16:17.260 ] 00:16:17.260 }' 00:16:17.260 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.260 11:57:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local 
base_bdev_info 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:17.827 [2024-07-12 11:57:07.935860] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:17.827 "name": "Existed_Raid", 00:16:17.827 "aliases": [ 00:16:17.827 "a3702e19-83cd-45e2-a18c-a51e7736e85c" 00:16:17.827 ], 00:16:17.827 "product_name": "Raid Volume", 00:16:17.827 "block_size": 512, 00:16:17.827 "num_blocks": 253952, 00:16:17.827 "uuid": "a3702e19-83cd-45e2-a18c-a51e7736e85c", 00:16:17.827 "assigned_rate_limits": { 00:16:17.827 "rw_ios_per_sec": 0, 00:16:17.827 "rw_mbytes_per_sec": 0, 00:16:17.827 "r_mbytes_per_sec": 0, 00:16:17.827 "w_mbytes_per_sec": 0 00:16:17.827 }, 00:16:17.827 "claimed": false, 00:16:17.827 "zoned": false, 00:16:17.827 "supported_io_types": { 00:16:17.827 "read": true, 00:16:17.827 "write": true, 00:16:17.827 "unmap": true, 00:16:17.827 "flush": true, 00:16:17.827 "reset": true, 00:16:17.827 "nvme_admin": false, 00:16:17.827 "nvme_io": false, 00:16:17.827 "nvme_io_md": false, 00:16:17.827 "write_zeroes": true, 00:16:17.827 "zcopy": false, 00:16:17.827 "get_zone_info": false, 00:16:17.827 "zone_management": false, 00:16:17.827 "zone_append": false, 00:16:17.827 "compare": false, 00:16:17.827 "compare_and_write": false, 00:16:17.827 "abort": false, 00:16:17.827 "seek_hole": false, 00:16:17.827 "seek_data": false, 00:16:17.827 "copy": false, 00:16:17.827 "nvme_iov_md": 
false 00:16:17.827 }, 00:16:17.827 "memory_domains": [ 00:16:17.827 { 00:16:17.827 "dma_device_id": "system", 00:16:17.827 "dma_device_type": 1 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.827 "dma_device_type": 2 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "dma_device_id": "system", 00:16:17.827 "dma_device_type": 1 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.827 "dma_device_type": 2 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "dma_device_id": "system", 00:16:17.827 "dma_device_type": 1 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.827 "dma_device_type": 2 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "dma_device_id": "system", 00:16:17.827 "dma_device_type": 1 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.827 "dma_device_type": 2 00:16:17.827 } 00:16:17.827 ], 00:16:17.827 "driver_specific": { 00:16:17.827 "raid": { 00:16:17.827 "uuid": "a3702e19-83cd-45e2-a18c-a51e7736e85c", 00:16:17.827 "strip_size_kb": 64, 00:16:17.827 "state": "online", 00:16:17.827 "raid_level": "concat", 00:16:17.827 "superblock": true, 00:16:17.827 "num_base_bdevs": 4, 00:16:17.827 "num_base_bdevs_discovered": 4, 00:16:17.827 "num_base_bdevs_operational": 4, 00:16:17.827 "base_bdevs_list": [ 00:16:17.827 { 00:16:17.827 "name": "BaseBdev1", 00:16:17.827 "uuid": "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe", 00:16:17.827 "is_configured": true, 00:16:17.827 "data_offset": 2048, 00:16:17.827 "data_size": 63488 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "name": "BaseBdev2", 00:16:17.827 "uuid": "299632e8-869a-484a-b6e8-ea92aa2a24cb", 00:16:17.827 "is_configured": true, 00:16:17.827 "data_offset": 2048, 00:16:17.827 "data_size": 63488 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "name": "BaseBdev3", 00:16:17.827 "uuid": "e97acddd-b4c8-49fd-a314-76d5b89896c0", 00:16:17.827 "is_configured": true, 00:16:17.827 
"data_offset": 2048, 00:16:17.827 "data_size": 63488 00:16:17.827 }, 00:16:17.827 { 00:16:17.827 "name": "BaseBdev4", 00:16:17.827 "uuid": "14b8f5c4-02bd-468d-94a6-284566ac1a05", 00:16:17.827 "is_configured": true, 00:16:17.827 "data_offset": 2048, 00:16:17.827 "data_size": 63488 00:16:17.827 } 00:16:17.827 ] 00:16:17.827 } 00:16:17.827 } 00:16:17.827 }' 00:16:17.827 11:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:17.827 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:17.827 BaseBdev2 00:16:17.827 BaseBdev3 00:16:17.827 BaseBdev4' 00:16:17.827 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.827 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:17.827 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.085 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.085 "name": "BaseBdev1", 00:16:18.085 "aliases": [ 00:16:18.085 "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe" 00:16:18.085 ], 00:16:18.085 "product_name": "Malloc disk", 00:16:18.085 "block_size": 512, 00:16:18.085 "num_blocks": 65536, 00:16:18.085 "uuid": "886ff0fe-c1ee-45c3-a47f-ed67c02a50fe", 00:16:18.085 "assigned_rate_limits": { 00:16:18.085 "rw_ios_per_sec": 0, 00:16:18.085 "rw_mbytes_per_sec": 0, 00:16:18.085 "r_mbytes_per_sec": 0, 00:16:18.085 "w_mbytes_per_sec": 0 00:16:18.085 }, 00:16:18.085 "claimed": true, 00:16:18.085 "claim_type": "exclusive_write", 00:16:18.085 "zoned": false, 00:16:18.085 "supported_io_types": { 00:16:18.085 "read": true, 00:16:18.085 "write": true, 00:16:18.085 "unmap": true, 00:16:18.085 "flush": 
true, 00:16:18.085 "reset": true, 00:16:18.085 "nvme_admin": false, 00:16:18.085 "nvme_io": false, 00:16:18.085 "nvme_io_md": false, 00:16:18.085 "write_zeroes": true, 00:16:18.085 "zcopy": true, 00:16:18.085 "get_zone_info": false, 00:16:18.085 "zone_management": false, 00:16:18.085 "zone_append": false, 00:16:18.085 "compare": false, 00:16:18.085 "compare_and_write": false, 00:16:18.085 "abort": true, 00:16:18.085 "seek_hole": false, 00:16:18.085 "seek_data": false, 00:16:18.085 "copy": true, 00:16:18.085 "nvme_iov_md": false 00:16:18.085 }, 00:16:18.085 "memory_domains": [ 00:16:18.085 { 00:16:18.085 "dma_device_id": "system", 00:16:18.085 "dma_device_type": 1 00:16:18.085 }, 00:16:18.085 { 00:16:18.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.085 "dma_device_type": 2 00:16:18.085 } 00:16:18.085 ], 00:16:18.085 "driver_specific": {} 00:16:18.085 }' 00:16:18.085 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.085 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.085 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.085 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.085 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.085 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:18.085 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.344 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.344 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.344 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.344 11:57:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.344 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.344 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:18.344 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.344 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.603 "name": "BaseBdev2", 00:16:18.603 "aliases": [ 00:16:18.603 "299632e8-869a-484a-b6e8-ea92aa2a24cb" 00:16:18.603 ], 00:16:18.603 "product_name": "Malloc disk", 00:16:18.603 "block_size": 512, 00:16:18.603 "num_blocks": 65536, 00:16:18.603 "uuid": "299632e8-869a-484a-b6e8-ea92aa2a24cb", 00:16:18.603 "assigned_rate_limits": { 00:16:18.603 "rw_ios_per_sec": 0, 00:16:18.603 "rw_mbytes_per_sec": 0, 00:16:18.603 "r_mbytes_per_sec": 0, 00:16:18.603 "w_mbytes_per_sec": 0 00:16:18.603 }, 00:16:18.603 "claimed": true, 00:16:18.603 "claim_type": "exclusive_write", 00:16:18.603 "zoned": false, 00:16:18.603 "supported_io_types": { 00:16:18.603 "read": true, 00:16:18.603 "write": true, 00:16:18.603 "unmap": true, 00:16:18.603 "flush": true, 00:16:18.603 "reset": true, 00:16:18.603 "nvme_admin": false, 00:16:18.603 "nvme_io": false, 00:16:18.603 "nvme_io_md": false, 00:16:18.603 "write_zeroes": true, 00:16:18.603 "zcopy": true, 00:16:18.603 "get_zone_info": false, 00:16:18.603 "zone_management": false, 00:16:18.603 "zone_append": false, 00:16:18.603 "compare": false, 00:16:18.603 "compare_and_write": false, 00:16:18.603 "abort": true, 00:16:18.603 "seek_hole": false, 00:16:18.603 "seek_data": false, 00:16:18.603 "copy": true, 00:16:18.603 "nvme_iov_md": false 00:16:18.603 }, 00:16:18.603 
"memory_domains": [ 00:16:18.603 { 00:16:18.603 "dma_device_id": "system", 00:16:18.603 "dma_device_type": 1 00:16:18.603 }, 00:16:18.603 { 00:16:18.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.603 "dma_device_type": 2 00:16:18.603 } 00:16:18.603 ], 00:16:18.603 "driver_specific": {} 00:16:18.603 }' 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.603 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.862 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.862 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.862 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:18.862 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:18.862 11:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
jq '.[]' 00:16:18.862 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.862 "name": "BaseBdev3", 00:16:18.862 "aliases": [ 00:16:18.862 "e97acddd-b4c8-49fd-a314-76d5b89896c0" 00:16:18.862 ], 00:16:18.862 "product_name": "Malloc disk", 00:16:18.862 "block_size": 512, 00:16:18.862 "num_blocks": 65536, 00:16:18.862 "uuid": "e97acddd-b4c8-49fd-a314-76d5b89896c0", 00:16:18.862 "assigned_rate_limits": { 00:16:18.862 "rw_ios_per_sec": 0, 00:16:18.862 "rw_mbytes_per_sec": 0, 00:16:18.862 "r_mbytes_per_sec": 0, 00:16:18.862 "w_mbytes_per_sec": 0 00:16:18.862 }, 00:16:18.862 "claimed": true, 00:16:18.862 "claim_type": "exclusive_write", 00:16:18.862 "zoned": false, 00:16:18.862 "supported_io_types": { 00:16:18.862 "read": true, 00:16:18.862 "write": true, 00:16:18.862 "unmap": true, 00:16:18.862 "flush": true, 00:16:18.862 "reset": true, 00:16:18.862 "nvme_admin": false, 00:16:18.862 "nvme_io": false, 00:16:18.862 "nvme_io_md": false, 00:16:18.862 "write_zeroes": true, 00:16:18.862 "zcopy": true, 00:16:18.862 "get_zone_info": false, 00:16:18.862 "zone_management": false, 00:16:18.862 "zone_append": false, 00:16:18.862 "compare": false, 00:16:18.862 "compare_and_write": false, 00:16:18.862 "abort": true, 00:16:18.862 "seek_hole": false, 00:16:18.862 "seek_data": false, 00:16:18.862 "copy": true, 00:16:18.862 "nvme_iov_md": false 00:16:18.862 }, 00:16:18.862 "memory_domains": [ 00:16:18.862 { 00:16:18.862 "dma_device_id": "system", 00:16:18.862 "dma_device_type": 1 00:16:18.862 }, 00:16:18.862 { 00:16:18.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.862 "dma_device_type": 2 00:16:18.862 } 00:16:18.862 ], 00:16:18.862 "driver_specific": {} 00:16:18.862 }' 00:16:18.862 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.120 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.120 11:57:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:19.120 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:19.120 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:19.120 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:19.120 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:19.120 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:19.120 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:19.120 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:19.379 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:19.379 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:19.379 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:19.379 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:19.379 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:19.379 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:19.379 "name": "BaseBdev4", 00:16:19.379 "aliases": [ 00:16:19.379 "14b8f5c4-02bd-468d-94a6-284566ac1a05" 00:16:19.379 ], 00:16:19.379 "product_name": "Malloc disk", 00:16:19.379 "block_size": 512, 00:16:19.379 "num_blocks": 65536, 00:16:19.379 "uuid": "14b8f5c4-02bd-468d-94a6-284566ac1a05", 00:16:19.379 "assigned_rate_limits": { 00:16:19.379 "rw_ios_per_sec": 0, 00:16:19.379 "rw_mbytes_per_sec": 0, 00:16:19.379 
"r_mbytes_per_sec": 0, 00:16:19.379 "w_mbytes_per_sec": 0 00:16:19.379 }, 00:16:19.379 "claimed": true, 00:16:19.379 "claim_type": "exclusive_write", 00:16:19.379 "zoned": false, 00:16:19.379 "supported_io_types": { 00:16:19.379 "read": true, 00:16:19.379 "write": true, 00:16:19.379 "unmap": true, 00:16:19.379 "flush": true, 00:16:19.379 "reset": true, 00:16:19.379 "nvme_admin": false, 00:16:19.379 "nvme_io": false, 00:16:19.379 "nvme_io_md": false, 00:16:19.379 "write_zeroes": true, 00:16:19.379 "zcopy": true, 00:16:19.379 "get_zone_info": false, 00:16:19.379 "zone_management": false, 00:16:19.379 "zone_append": false, 00:16:19.379 "compare": false, 00:16:19.379 "compare_and_write": false, 00:16:19.379 "abort": true, 00:16:19.379 "seek_hole": false, 00:16:19.379 "seek_data": false, 00:16:19.379 "copy": true, 00:16:19.379 "nvme_iov_md": false 00:16:19.379 }, 00:16:19.379 "memory_domains": [ 00:16:19.379 { 00:16:19.379 "dma_device_id": "system", 00:16:19.379 "dma_device_type": 1 00:16:19.379 }, 00:16:19.379 { 00:16:19.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.379 "dma_device_type": 2 00:16:19.379 } 00:16:19.379 ], 00:16:19.379 "driver_specific": {} 00:16:19.379 }' 00:16:19.379 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:19.638 11:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:19.896 [2024-07-12 11:57:10.037148] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:19.896 [2024-07-12 11:57:10.037170] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:19.896 [2024-07-12 11:57:10.037204] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:19.896 11:57:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.896 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:20.155 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.155 "name": "Existed_Raid", 00:16:20.155 "uuid": "a3702e19-83cd-45e2-a18c-a51e7736e85c", 00:16:20.155 "strip_size_kb": 64, 00:16:20.155 "state": "offline", 00:16:20.155 "raid_level": "concat", 00:16:20.155 "superblock": true, 00:16:20.155 "num_base_bdevs": 4, 00:16:20.155 "num_base_bdevs_discovered": 3, 00:16:20.155 "num_base_bdevs_operational": 3, 00:16:20.155 "base_bdevs_list": [ 00:16:20.155 { 00:16:20.155 "name": null, 00:16:20.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.155 "is_configured": false, 00:16:20.155 "data_offset": 2048, 00:16:20.155 "data_size": 63488 00:16:20.155 }, 00:16:20.155 { 00:16:20.155 "name": "BaseBdev2", 00:16:20.155 "uuid": "299632e8-869a-484a-b6e8-ea92aa2a24cb", 00:16:20.155 "is_configured": true, 00:16:20.155 
"data_offset": 2048, 00:16:20.155 "data_size": 63488 00:16:20.155 }, 00:16:20.155 { 00:16:20.155 "name": "BaseBdev3", 00:16:20.155 "uuid": "e97acddd-b4c8-49fd-a314-76d5b89896c0", 00:16:20.155 "is_configured": true, 00:16:20.155 "data_offset": 2048, 00:16:20.155 "data_size": 63488 00:16:20.155 }, 00:16:20.155 { 00:16:20.155 "name": "BaseBdev4", 00:16:20.155 "uuid": "14b8f5c4-02bd-468d-94a6-284566ac1a05", 00:16:20.155 "is_configured": true, 00:16:20.155 "data_offset": 2048, 00:16:20.155 "data_size": 63488 00:16:20.155 } 00:16:20.155 ] 00:16:20.155 }' 00:16:20.155 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.155 11:57:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:20.722 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:20.722 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:20.722 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.722 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:20.722 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:20.722 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:20.722 11:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:20.979 [2024-07-12 11:57:11.068667] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:20.979 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:20.979 11:57:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:20.979 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.979 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:21.237 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:21.237 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:21.237 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:21.237 [2024-07-12 11:57:11.427114] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:21.237 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:21.237 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:21.237 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.237 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:21.496 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:21.496 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:21.496 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:21.755 [2024-07-12 11:57:11.773632] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:21.755 [2024-07-12 11:57:11.773661] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a9b90 name Existed_Raid, state offline 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:21.755 11:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:22.014 BaseBdev2 00:16:22.014 11:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:22.014 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:22.014 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:22.014 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:22.014 
11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:22.014 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:22.014 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:22.273 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:22.273 [ 00:16:22.273 { 00:16:22.273 "name": "BaseBdev2", 00:16:22.273 "aliases": [ 00:16:22.273 "fed265e8-cab8-43cc-9812-ac04b466b2bd" 00:16:22.273 ], 00:16:22.273 "product_name": "Malloc disk", 00:16:22.273 "block_size": 512, 00:16:22.273 "num_blocks": 65536, 00:16:22.273 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:22.273 "assigned_rate_limits": { 00:16:22.273 "rw_ios_per_sec": 0, 00:16:22.273 "rw_mbytes_per_sec": 0, 00:16:22.273 "r_mbytes_per_sec": 0, 00:16:22.273 "w_mbytes_per_sec": 0 00:16:22.273 }, 00:16:22.273 "claimed": false, 00:16:22.273 "zoned": false, 00:16:22.273 "supported_io_types": { 00:16:22.273 "read": true, 00:16:22.273 "write": true, 00:16:22.273 "unmap": true, 00:16:22.273 "flush": true, 00:16:22.273 "reset": true, 00:16:22.273 "nvme_admin": false, 00:16:22.273 "nvme_io": false, 00:16:22.273 "nvme_io_md": false, 00:16:22.273 "write_zeroes": true, 00:16:22.273 "zcopy": true, 00:16:22.273 "get_zone_info": false, 00:16:22.273 "zone_management": false, 00:16:22.273 "zone_append": false, 00:16:22.273 "compare": false, 00:16:22.273 "compare_and_write": false, 00:16:22.273 "abort": true, 00:16:22.273 "seek_hole": false, 00:16:22.273 "seek_data": false, 00:16:22.273 "copy": true, 00:16:22.273 "nvme_iov_md": false 00:16:22.273 }, 00:16:22.273 "memory_domains": [ 00:16:22.273 { 00:16:22.273 
"dma_device_id": "system", 00:16:22.273 "dma_device_type": 1 00:16:22.273 }, 00:16:22.273 { 00:16:22.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.273 "dma_device_type": 2 00:16:22.273 } 00:16:22.273 ], 00:16:22.273 "driver_specific": {} 00:16:22.273 } 00:16:22.273 ] 00:16:22.273 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:22.273 11:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:22.273 11:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:22.273 11:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:22.532 BaseBdev3 00:16:22.532 11:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:22.532 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:22.532 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:22.532 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:22.532 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:22.532 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:22.532 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:22.790 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:22.790 [ 00:16:22.790 { 
00:16:22.790 "name": "BaseBdev3", 00:16:22.790 "aliases": [ 00:16:22.790 "34208269-03ca-4a75-959e-5547ba4fc081" 00:16:22.790 ], 00:16:22.790 "product_name": "Malloc disk", 00:16:22.790 "block_size": 512, 00:16:22.790 "num_blocks": 65536, 00:16:22.790 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:22.790 "assigned_rate_limits": { 00:16:22.790 "rw_ios_per_sec": 0, 00:16:22.790 "rw_mbytes_per_sec": 0, 00:16:22.790 "r_mbytes_per_sec": 0, 00:16:22.790 "w_mbytes_per_sec": 0 00:16:22.790 }, 00:16:22.790 "claimed": false, 00:16:22.790 "zoned": false, 00:16:22.790 "supported_io_types": { 00:16:22.790 "read": true, 00:16:22.790 "write": true, 00:16:22.790 "unmap": true, 00:16:22.790 "flush": true, 00:16:22.790 "reset": true, 00:16:22.790 "nvme_admin": false, 00:16:22.790 "nvme_io": false, 00:16:22.790 "nvme_io_md": false, 00:16:22.790 "write_zeroes": true, 00:16:22.790 "zcopy": true, 00:16:22.790 "get_zone_info": false, 00:16:22.790 "zone_management": false, 00:16:22.790 "zone_append": false, 00:16:22.790 "compare": false, 00:16:22.790 "compare_and_write": false, 00:16:22.790 "abort": true, 00:16:22.790 "seek_hole": false, 00:16:22.790 "seek_data": false, 00:16:22.790 "copy": true, 00:16:22.790 "nvme_iov_md": false 00:16:22.790 }, 00:16:22.790 "memory_domains": [ 00:16:22.790 { 00:16:22.790 "dma_device_id": "system", 00:16:22.790 "dma_device_type": 1 00:16:22.790 }, 00:16:22.790 { 00:16:22.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.790 "dma_device_type": 2 00:16:22.790 } 00:16:22.790 ], 00:16:22.790 "driver_specific": {} 00:16:22.790 } 00:16:22.790 ] 00:16:22.790 11:57:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:22.790 11:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:22.790 11:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:22.790 11:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:23.048 BaseBdev4 00:16:23.048 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:23.048 11:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:23.048 11:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:23.048 11:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:23.048 11:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:23.048 11:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:23.048 11:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:23.307 11:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:23.307 [ 00:16:23.307 { 00:16:23.307 "name": "BaseBdev4", 00:16:23.307 "aliases": [ 00:16:23.307 "47de8451-1f1c-4f1e-bafd-8c85dd2e490e" 00:16:23.307 ], 00:16:23.307 "product_name": "Malloc disk", 00:16:23.307 "block_size": 512, 00:16:23.307 "num_blocks": 65536, 00:16:23.307 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:23.307 "assigned_rate_limits": { 00:16:23.307 "rw_ios_per_sec": 0, 00:16:23.307 "rw_mbytes_per_sec": 0, 00:16:23.307 "r_mbytes_per_sec": 0, 00:16:23.307 "w_mbytes_per_sec": 0 00:16:23.307 }, 00:16:23.307 "claimed": false, 00:16:23.307 "zoned": false, 00:16:23.307 "supported_io_types": { 00:16:23.307 "read": true, 00:16:23.307 "write": true, 00:16:23.307 "unmap": true, 00:16:23.307 "flush": 
true, 00:16:23.307 "reset": true, 00:16:23.307 "nvme_admin": false, 00:16:23.307 "nvme_io": false, 00:16:23.307 "nvme_io_md": false, 00:16:23.307 "write_zeroes": true, 00:16:23.307 "zcopy": true, 00:16:23.307 "get_zone_info": false, 00:16:23.307 "zone_management": false, 00:16:23.307 "zone_append": false, 00:16:23.307 "compare": false, 00:16:23.307 "compare_and_write": false, 00:16:23.307 "abort": true, 00:16:23.307 "seek_hole": false, 00:16:23.307 "seek_data": false, 00:16:23.307 "copy": true, 00:16:23.307 "nvme_iov_md": false 00:16:23.307 }, 00:16:23.307 "memory_domains": [ 00:16:23.307 { 00:16:23.307 "dma_device_id": "system", 00:16:23.307 "dma_device_type": 1 00:16:23.307 }, 00:16:23.307 { 00:16:23.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.307 "dma_device_type": 2 00:16:23.307 } 00:16:23.307 ], 00:16:23.307 "driver_specific": {} 00:16:23.307 } 00:16:23.307 ] 00:16:23.307 11:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:23.307 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:23.307 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:23.307 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:23.566 [2024-07-12 11:57:13.619475] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:23.566 [2024-07-12 11:57:13.619503] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:23.566 [2024-07-12 11:57:13.619515] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:23.566 [2024-07-12 11:57:13.620493] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 
00:16:23.566 [2024-07-12 11:57:13.620532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.566 "name": "Existed_Raid", 00:16:23.566 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:23.566 "strip_size_kb": 64, 00:16:23.566 "state": "configuring", 00:16:23.566 "raid_level": "concat", 00:16:23.566 "superblock": 
true, 00:16:23.566 "num_base_bdevs": 4, 00:16:23.566 "num_base_bdevs_discovered": 3, 00:16:23.566 "num_base_bdevs_operational": 4, 00:16:23.566 "base_bdevs_list": [ 00:16:23.566 { 00:16:23.566 "name": "BaseBdev1", 00:16:23.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.566 "is_configured": false, 00:16:23.566 "data_offset": 0, 00:16:23.566 "data_size": 0 00:16:23.566 }, 00:16:23.566 { 00:16:23.566 "name": "BaseBdev2", 00:16:23.566 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:23.566 "is_configured": true, 00:16:23.566 "data_offset": 2048, 00:16:23.566 "data_size": 63488 00:16:23.566 }, 00:16:23.566 { 00:16:23.566 "name": "BaseBdev3", 00:16:23.566 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:23.566 "is_configured": true, 00:16:23.566 "data_offset": 2048, 00:16:23.566 "data_size": 63488 00:16:23.566 }, 00:16:23.566 { 00:16:23.566 "name": "BaseBdev4", 00:16:23.566 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:23.566 "is_configured": true, 00:16:23.566 "data_offset": 2048, 00:16:23.566 "data_size": 63488 00:16:23.566 } 00:16:23.566 ] 00:16:23.566 }' 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.566 11:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:24.152 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:24.410 [2024-07-12 11:57:14.437573] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # 
local expected_state=configuring 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.410 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.410 "name": "Existed_Raid", 00:16:24.410 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:24.411 "strip_size_kb": 64, 00:16:24.411 "state": "configuring", 00:16:24.411 "raid_level": "concat", 00:16:24.411 "superblock": true, 00:16:24.411 "num_base_bdevs": 4, 00:16:24.411 "num_base_bdevs_discovered": 2, 00:16:24.411 "num_base_bdevs_operational": 4, 00:16:24.411 "base_bdevs_list": [ 00:16:24.411 { 00:16:24.411 "name": "BaseBdev1", 00:16:24.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:24.411 "is_configured": false, 00:16:24.411 "data_offset": 0, 00:16:24.411 "data_size": 0 00:16:24.411 }, 00:16:24.411 { 00:16:24.411 "name": null, 00:16:24.411 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 
00:16:24.411 "is_configured": false, 00:16:24.411 "data_offset": 2048, 00:16:24.411 "data_size": 63488 00:16:24.411 }, 00:16:24.411 { 00:16:24.411 "name": "BaseBdev3", 00:16:24.411 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:24.411 "is_configured": true, 00:16:24.411 "data_offset": 2048, 00:16:24.411 "data_size": 63488 00:16:24.411 }, 00:16:24.411 { 00:16:24.411 "name": "BaseBdev4", 00:16:24.411 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:24.411 "is_configured": true, 00:16:24.411 "data_offset": 2048, 00:16:24.411 "data_size": 63488 00:16:24.411 } 00:16:24.411 ] 00:16:24.411 }' 00:16:24.411 11:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.411 11:57:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:24.974 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:24.974 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.232 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:25.232 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:25.232 [2024-07-12 11:57:15.430781] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:25.232 BaseBdev1 00:16:25.232 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:25.232 11:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:25.232 11:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:16:25.232 11:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:25.232 11:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:25.232 11:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:25.232 11:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:25.509 11:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:25.768 [ 00:16:25.768 { 00:16:25.768 "name": "BaseBdev1", 00:16:25.768 "aliases": [ 00:16:25.768 "272b0043-5dc1-4846-a64b-bc9cf16145b5" 00:16:25.768 ], 00:16:25.768 "product_name": "Malloc disk", 00:16:25.768 "block_size": 512, 00:16:25.768 "num_blocks": 65536, 00:16:25.768 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:25.768 "assigned_rate_limits": { 00:16:25.768 "rw_ios_per_sec": 0, 00:16:25.768 "rw_mbytes_per_sec": 0, 00:16:25.768 "r_mbytes_per_sec": 0, 00:16:25.768 "w_mbytes_per_sec": 0 00:16:25.768 }, 00:16:25.768 "claimed": true, 00:16:25.768 "claim_type": "exclusive_write", 00:16:25.768 "zoned": false, 00:16:25.768 "supported_io_types": { 00:16:25.768 "read": true, 00:16:25.768 "write": true, 00:16:25.768 "unmap": true, 00:16:25.768 "flush": true, 00:16:25.768 "reset": true, 00:16:25.768 "nvme_admin": false, 00:16:25.768 "nvme_io": false, 00:16:25.768 "nvme_io_md": false, 00:16:25.768 "write_zeroes": true, 00:16:25.768 "zcopy": true, 00:16:25.768 "get_zone_info": false, 00:16:25.768 "zone_management": false, 00:16:25.768 "zone_append": false, 00:16:25.768 "compare": false, 00:16:25.768 "compare_and_write": false, 00:16:25.768 "abort": true, 00:16:25.768 "seek_hole": false, 00:16:25.768 
"seek_data": false, 00:16:25.768 "copy": true, 00:16:25.768 "nvme_iov_md": false 00:16:25.768 }, 00:16:25.768 "memory_domains": [ 00:16:25.768 { 00:16:25.768 "dma_device_id": "system", 00:16:25.768 "dma_device_type": 1 00:16:25.768 }, 00:16:25.768 { 00:16:25.768 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.768 "dma_device_type": 2 00:16:25.768 } 00:16:25.768 ], 00:16:25.768 "driver_specific": {} 00:16:25.768 } 00:16:25.768 ] 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:25.768 "name": "Existed_Raid", 00:16:25.768 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:25.768 "strip_size_kb": 64, 00:16:25.768 "state": "configuring", 00:16:25.768 "raid_level": "concat", 00:16:25.768 "superblock": true, 00:16:25.768 "num_base_bdevs": 4, 00:16:25.768 "num_base_bdevs_discovered": 3, 00:16:25.768 "num_base_bdevs_operational": 4, 00:16:25.768 "base_bdevs_list": [ 00:16:25.768 { 00:16:25.768 "name": "BaseBdev1", 00:16:25.768 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:25.768 "is_configured": true, 00:16:25.768 "data_offset": 2048, 00:16:25.768 "data_size": 63488 00:16:25.768 }, 00:16:25.768 { 00:16:25.768 "name": null, 00:16:25.768 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:25.768 "is_configured": false, 00:16:25.768 "data_offset": 2048, 00:16:25.768 "data_size": 63488 00:16:25.768 }, 00:16:25.768 { 00:16:25.768 "name": "BaseBdev3", 00:16:25.768 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:25.768 "is_configured": true, 00:16:25.768 "data_offset": 2048, 00:16:25.768 "data_size": 63488 00:16:25.768 }, 00:16:25.768 { 00:16:25.768 "name": "BaseBdev4", 00:16:25.768 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:25.768 "is_configured": true, 00:16:25.768 "data_offset": 2048, 00:16:25.768 "data_size": 63488 00:16:25.768 } 00:16:25.768 ] 00:16:25.768 }' 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:25.768 11:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:26.336 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.336 11:57:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:26.594 [2024-07-12 11:57:16.746209] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.594 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "Existed_Raid")' 00:16:26.852 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.852 "name": "Existed_Raid", 00:16:26.852 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:26.852 "strip_size_kb": 64, 00:16:26.852 "state": "configuring", 00:16:26.852 "raid_level": "concat", 00:16:26.852 "superblock": true, 00:16:26.852 "num_base_bdevs": 4, 00:16:26.852 "num_base_bdevs_discovered": 2, 00:16:26.852 "num_base_bdevs_operational": 4, 00:16:26.852 "base_bdevs_list": [ 00:16:26.852 { 00:16:26.852 "name": "BaseBdev1", 00:16:26.852 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:26.852 "is_configured": true, 00:16:26.852 "data_offset": 2048, 00:16:26.852 "data_size": 63488 00:16:26.852 }, 00:16:26.852 { 00:16:26.852 "name": null, 00:16:26.852 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:26.852 "is_configured": false, 00:16:26.852 "data_offset": 2048, 00:16:26.852 "data_size": 63488 00:16:26.852 }, 00:16:26.853 { 00:16:26.853 "name": null, 00:16:26.853 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:26.853 "is_configured": false, 00:16:26.853 "data_offset": 2048, 00:16:26.853 "data_size": 63488 00:16:26.853 }, 00:16:26.853 { 00:16:26.853 "name": "BaseBdev4", 00:16:26.853 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:26.853 "is_configured": true, 00:16:26.853 "data_offset": 2048, 00:16:26.853 "data_size": 63488 00:16:26.853 } 00:16:26.853 ] 00:16:26.853 }' 00:16:26.853 11:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.853 11:57:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:27.420 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.420 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:16:27.420 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:27.420 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:27.678 [2024-07-12 11:57:17.728774] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "Existed_Raid")' 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.678 "name": "Existed_Raid", 00:16:27.678 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:27.678 "strip_size_kb": 64, 00:16:27.678 "state": "configuring", 00:16:27.678 "raid_level": "concat", 00:16:27.678 "superblock": true, 00:16:27.678 "num_base_bdevs": 4, 00:16:27.678 "num_base_bdevs_discovered": 3, 00:16:27.678 "num_base_bdevs_operational": 4, 00:16:27.678 "base_bdevs_list": [ 00:16:27.678 { 00:16:27.678 "name": "BaseBdev1", 00:16:27.678 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:27.678 "is_configured": true, 00:16:27.678 "data_offset": 2048, 00:16:27.678 "data_size": 63488 00:16:27.678 }, 00:16:27.678 { 00:16:27.678 "name": null, 00:16:27.678 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:27.678 "is_configured": false, 00:16:27.678 "data_offset": 2048, 00:16:27.678 "data_size": 63488 00:16:27.678 }, 00:16:27.678 { 00:16:27.678 "name": "BaseBdev3", 00:16:27.678 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:27.678 "is_configured": true, 00:16:27.678 "data_offset": 2048, 00:16:27.678 "data_size": 63488 00:16:27.678 }, 00:16:27.678 { 00:16:27.678 "name": "BaseBdev4", 00:16:27.678 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:27.678 "is_configured": true, 00:16:27.678 "data_offset": 2048, 00:16:27.678 "data_size": 63488 00:16:27.678 } 00:16:27.678 ] 00:16:27.678 }' 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.678 11:57:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:28.245 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.245 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:28.504 [2024-07-12 11:57:18.715341] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.504 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:16:28.763 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.763 "name": "Existed_Raid", 00:16:28.763 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:28.763 "strip_size_kb": 64, 00:16:28.763 "state": "configuring", 00:16:28.763 "raid_level": "concat", 00:16:28.763 "superblock": true, 00:16:28.763 "num_base_bdevs": 4, 00:16:28.763 "num_base_bdevs_discovered": 2, 00:16:28.763 "num_base_bdevs_operational": 4, 00:16:28.763 "base_bdevs_list": [ 00:16:28.763 { 00:16:28.763 "name": null, 00:16:28.763 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:28.763 "is_configured": false, 00:16:28.763 "data_offset": 2048, 00:16:28.763 "data_size": 63488 00:16:28.763 }, 00:16:28.763 { 00:16:28.763 "name": null, 00:16:28.763 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:28.763 "is_configured": false, 00:16:28.763 "data_offset": 2048, 00:16:28.763 "data_size": 63488 00:16:28.763 }, 00:16:28.763 { 00:16:28.763 "name": "BaseBdev3", 00:16:28.763 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:28.763 "is_configured": true, 00:16:28.763 "data_offset": 2048, 00:16:28.763 "data_size": 63488 00:16:28.763 }, 00:16:28.763 { 00:16:28.763 "name": "BaseBdev4", 00:16:28.763 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:28.763 "is_configured": true, 00:16:28.763 "data_offset": 2048, 00:16:28.763 "data_size": 63488 00:16:28.763 } 00:16:28.763 ] 00:16:28.763 }' 00:16:28.763 11:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.763 11:57:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:29.330 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.330 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:29.330 
11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:29.330 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:29.589 [2024-07-12 11:57:19.715630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.589 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.847 
11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.847 "name": "Existed_Raid", 00:16:29.847 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:29.847 "strip_size_kb": 64, 00:16:29.847 "state": "configuring", 00:16:29.847 "raid_level": "concat", 00:16:29.847 "superblock": true, 00:16:29.847 "num_base_bdevs": 4, 00:16:29.847 "num_base_bdevs_discovered": 3, 00:16:29.847 "num_base_bdevs_operational": 4, 00:16:29.847 "base_bdevs_list": [ 00:16:29.847 { 00:16:29.847 "name": null, 00:16:29.847 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:29.847 "is_configured": false, 00:16:29.847 "data_offset": 2048, 00:16:29.847 "data_size": 63488 00:16:29.847 }, 00:16:29.847 { 00:16:29.847 "name": "BaseBdev2", 00:16:29.847 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:29.847 "is_configured": true, 00:16:29.847 "data_offset": 2048, 00:16:29.847 "data_size": 63488 00:16:29.847 }, 00:16:29.847 { 00:16:29.847 "name": "BaseBdev3", 00:16:29.847 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:29.847 "is_configured": true, 00:16:29.847 "data_offset": 2048, 00:16:29.847 "data_size": 63488 00:16:29.847 }, 00:16:29.847 { 00:16:29.847 "name": "BaseBdev4", 00:16:29.847 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:29.847 "is_configured": true, 00:16:29.847 "data_offset": 2048, 00:16:29.847 "data_size": 63488 00:16:29.847 } 00:16:29.847 ] 00:16:29.847 }' 00:16:29.847 11:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.847 11:57:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:30.414 11:57:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.414 11:57:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:30.414 11:57:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:30.414 11:57:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.414 11:57:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:30.674 11:57:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 272b0043-5dc1-4846-a64b-bc9cf16145b5 00:16:30.674 [2024-07-12 11:57:20.869300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:30.674 [2024-07-12 11:57:20.869415] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x174f6b0 00:16:30.674 [2024-07-12 11:57:20.869424] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:30.674 [2024-07-12 11:57:20.869551] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x174f130 00:16:30.674 [2024-07-12 11:57:20.869631] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x174f6b0 00:16:30.674 [2024-07-12 11:57:20.869637] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x174f6b0 00:16:30.674 [2024-07-12 11:57:20.869699] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:30.674 NewBaseBdev 00:16:30.674 11:57:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:30.674 11:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:30.674 11:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:30.674 11:57:20 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:30.674 11:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:30.674 11:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:30.674 11:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.933 11:57:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:31.191 [ 00:16:31.191 { 00:16:31.191 "name": "NewBaseBdev", 00:16:31.191 "aliases": [ 00:16:31.191 "272b0043-5dc1-4846-a64b-bc9cf16145b5" 00:16:31.191 ], 00:16:31.191 "product_name": "Malloc disk", 00:16:31.191 "block_size": 512, 00:16:31.191 "num_blocks": 65536, 00:16:31.191 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:31.191 "assigned_rate_limits": { 00:16:31.191 "rw_ios_per_sec": 0, 00:16:31.191 "rw_mbytes_per_sec": 0, 00:16:31.191 "r_mbytes_per_sec": 0, 00:16:31.191 "w_mbytes_per_sec": 0 00:16:31.191 }, 00:16:31.191 "claimed": true, 00:16:31.191 "claim_type": "exclusive_write", 00:16:31.191 "zoned": false, 00:16:31.191 "supported_io_types": { 00:16:31.191 "read": true, 00:16:31.191 "write": true, 00:16:31.191 "unmap": true, 00:16:31.191 "flush": true, 00:16:31.191 "reset": true, 00:16:31.191 "nvme_admin": false, 00:16:31.191 "nvme_io": false, 00:16:31.191 "nvme_io_md": false, 00:16:31.191 "write_zeroes": true, 00:16:31.191 "zcopy": true, 00:16:31.191 "get_zone_info": false, 00:16:31.191 "zone_management": false, 00:16:31.191 "zone_append": false, 00:16:31.191 "compare": false, 00:16:31.191 "compare_and_write": false, 00:16:31.191 "abort": true, 00:16:31.191 "seek_hole": false, 00:16:31.191 "seek_data": false, 
00:16:31.191 "copy": true, 00:16:31.191 "nvme_iov_md": false 00:16:31.191 }, 00:16:31.191 "memory_domains": [ 00:16:31.191 { 00:16:31.191 "dma_device_id": "system", 00:16:31.191 "dma_device_type": 1 00:16:31.191 }, 00:16:31.191 { 00:16:31.191 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.191 "dma_device_type": 2 00:16:31.191 } 00:16:31.191 ], 00:16:31.191 "driver_specific": {} 00:16:31.191 } 00:16:31.191 ] 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "Existed_Raid")' 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.191 "name": "Existed_Raid", 00:16:31.191 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:31.191 "strip_size_kb": 64, 00:16:31.191 "state": "online", 00:16:31.191 "raid_level": "concat", 00:16:31.191 "superblock": true, 00:16:31.191 "num_base_bdevs": 4, 00:16:31.191 "num_base_bdevs_discovered": 4, 00:16:31.191 "num_base_bdevs_operational": 4, 00:16:31.191 "base_bdevs_list": [ 00:16:31.191 { 00:16:31.191 "name": "NewBaseBdev", 00:16:31.191 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:31.191 "is_configured": true, 00:16:31.191 "data_offset": 2048, 00:16:31.191 "data_size": 63488 00:16:31.191 }, 00:16:31.191 { 00:16:31.191 "name": "BaseBdev2", 00:16:31.191 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:31.191 "is_configured": true, 00:16:31.191 "data_offset": 2048, 00:16:31.191 "data_size": 63488 00:16:31.191 }, 00:16:31.191 { 00:16:31.191 "name": "BaseBdev3", 00:16:31.191 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:31.191 "is_configured": true, 00:16:31.191 "data_offset": 2048, 00:16:31.191 "data_size": 63488 00:16:31.191 }, 00:16:31.191 { 00:16:31.191 "name": "BaseBdev4", 00:16:31.191 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:31.191 "is_configured": true, 00:16:31.191 "data_offset": 2048, 00:16:31.191 "data_size": 63488 00:16:31.191 } 00:16:31.191 ] 00:16:31.191 }' 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.191 11:57:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.758 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:31.758 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:31.758 11:57:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:31.758 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:31.758 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:31.758 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:31.758 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:31.759 11:57:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:32.018 [2024-07-12 11:57:22.012469] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:32.018 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:32.018 "name": "Existed_Raid", 00:16:32.018 "aliases": [ 00:16:32.018 "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2" 00:16:32.018 ], 00:16:32.018 "product_name": "Raid Volume", 00:16:32.018 "block_size": 512, 00:16:32.018 "num_blocks": 253952, 00:16:32.018 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:32.018 "assigned_rate_limits": { 00:16:32.018 "rw_ios_per_sec": 0, 00:16:32.018 "rw_mbytes_per_sec": 0, 00:16:32.018 "r_mbytes_per_sec": 0, 00:16:32.018 "w_mbytes_per_sec": 0 00:16:32.018 }, 00:16:32.018 "claimed": false, 00:16:32.018 "zoned": false, 00:16:32.018 "supported_io_types": { 00:16:32.018 "read": true, 00:16:32.018 "write": true, 00:16:32.018 "unmap": true, 00:16:32.018 "flush": true, 00:16:32.018 "reset": true, 00:16:32.018 "nvme_admin": false, 00:16:32.018 "nvme_io": false, 00:16:32.018 "nvme_io_md": false, 00:16:32.018 "write_zeroes": true, 00:16:32.018 "zcopy": false, 00:16:32.018 "get_zone_info": false, 00:16:32.018 "zone_management": false, 00:16:32.018 "zone_append": false, 00:16:32.018 "compare": false, 
00:16:32.018 "compare_and_write": false, 00:16:32.018 "abort": false, 00:16:32.018 "seek_hole": false, 00:16:32.018 "seek_data": false, 00:16:32.018 "copy": false, 00:16:32.018 "nvme_iov_md": false 00:16:32.018 }, 00:16:32.018 "memory_domains": [ 00:16:32.018 { 00:16:32.018 "dma_device_id": "system", 00:16:32.018 "dma_device_type": 1 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.018 "dma_device_type": 2 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "dma_device_id": "system", 00:16:32.018 "dma_device_type": 1 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.018 "dma_device_type": 2 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "dma_device_id": "system", 00:16:32.018 "dma_device_type": 1 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.018 "dma_device_type": 2 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "dma_device_id": "system", 00:16:32.018 "dma_device_type": 1 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.018 "dma_device_type": 2 00:16:32.018 } 00:16:32.018 ], 00:16:32.018 "driver_specific": { 00:16:32.018 "raid": { 00:16:32.018 "uuid": "1a25a6f3-da8b-4617-8aa0-ba1ce3acb3b2", 00:16:32.018 "strip_size_kb": 64, 00:16:32.018 "state": "online", 00:16:32.018 "raid_level": "concat", 00:16:32.018 "superblock": true, 00:16:32.018 "num_base_bdevs": 4, 00:16:32.018 "num_base_bdevs_discovered": 4, 00:16:32.018 "num_base_bdevs_operational": 4, 00:16:32.018 "base_bdevs_list": [ 00:16:32.018 { 00:16:32.018 "name": "NewBaseBdev", 00:16:32.018 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:32.018 "is_configured": true, 00:16:32.018 "data_offset": 2048, 00:16:32.018 "data_size": 63488 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "name": "BaseBdev2", 00:16:32.018 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:32.018 "is_configured": true, 00:16:32.018 "data_offset": 2048, 00:16:32.018 
"data_size": 63488 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "name": "BaseBdev3", 00:16:32.018 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:32.018 "is_configured": true, 00:16:32.018 "data_offset": 2048, 00:16:32.018 "data_size": 63488 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "name": "BaseBdev4", 00:16:32.018 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:32.018 "is_configured": true, 00:16:32.018 "data_offset": 2048, 00:16:32.018 "data_size": 63488 00:16:32.018 } 00:16:32.018 ] 00:16:32.018 } 00:16:32.018 } 00:16:32.018 }' 00:16:32.018 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:32.018 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:32.018 BaseBdev2 00:16:32.018 BaseBdev3 00:16:32.018 BaseBdev4' 00:16:32.018 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.018 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:32.018 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:32.018 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:32.018 "name": "NewBaseBdev", 00:16:32.018 "aliases": [ 00:16:32.018 "272b0043-5dc1-4846-a64b-bc9cf16145b5" 00:16:32.018 ], 00:16:32.018 "product_name": "Malloc disk", 00:16:32.018 "block_size": 512, 00:16:32.018 "num_blocks": 65536, 00:16:32.018 "uuid": "272b0043-5dc1-4846-a64b-bc9cf16145b5", 00:16:32.018 "assigned_rate_limits": { 00:16:32.018 "rw_ios_per_sec": 0, 00:16:32.018 "rw_mbytes_per_sec": 0, 00:16:32.018 "r_mbytes_per_sec": 0, 00:16:32.018 "w_mbytes_per_sec": 0 00:16:32.018 }, 00:16:32.018 "claimed": true, 00:16:32.018 
"claim_type": "exclusive_write", 00:16:32.018 "zoned": false, 00:16:32.018 "supported_io_types": { 00:16:32.018 "read": true, 00:16:32.018 "write": true, 00:16:32.018 "unmap": true, 00:16:32.018 "flush": true, 00:16:32.018 "reset": true, 00:16:32.018 "nvme_admin": false, 00:16:32.018 "nvme_io": false, 00:16:32.018 "nvme_io_md": false, 00:16:32.018 "write_zeroes": true, 00:16:32.018 "zcopy": true, 00:16:32.018 "get_zone_info": false, 00:16:32.018 "zone_management": false, 00:16:32.018 "zone_append": false, 00:16:32.018 "compare": false, 00:16:32.018 "compare_and_write": false, 00:16:32.018 "abort": true, 00:16:32.018 "seek_hole": false, 00:16:32.018 "seek_data": false, 00:16:32.018 "copy": true, 00:16:32.018 "nvme_iov_md": false 00:16:32.018 }, 00:16:32.018 "memory_domains": [ 00:16:32.018 { 00:16:32.018 "dma_device_id": "system", 00:16:32.018 "dma_device_type": 1 00:16:32.018 }, 00:16:32.018 { 00:16:32.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.018 "dma_device_type": 2 00:16:32.018 } 00:16:32.018 ], 00:16:32.018 "driver_specific": {} 00:16:32.018 }' 00:16:32.018 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.276 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.535 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:32.535 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.535 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:32.535 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:32.535 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:32.535 "name": "BaseBdev2", 00:16:32.535 "aliases": [ 00:16:32.535 "fed265e8-cab8-43cc-9812-ac04b466b2bd" 00:16:32.535 ], 00:16:32.535 "product_name": "Malloc disk", 00:16:32.535 "block_size": 512, 00:16:32.535 "num_blocks": 65536, 00:16:32.535 "uuid": "fed265e8-cab8-43cc-9812-ac04b466b2bd", 00:16:32.535 "assigned_rate_limits": { 00:16:32.535 "rw_ios_per_sec": 0, 00:16:32.535 "rw_mbytes_per_sec": 0, 00:16:32.535 "r_mbytes_per_sec": 0, 00:16:32.535 "w_mbytes_per_sec": 0 00:16:32.535 }, 00:16:32.535 "claimed": true, 00:16:32.535 "claim_type": "exclusive_write", 00:16:32.535 "zoned": false, 00:16:32.535 "supported_io_types": { 00:16:32.535 "read": true, 00:16:32.535 "write": true, 00:16:32.535 "unmap": true, 00:16:32.535 "flush": true, 00:16:32.535 "reset": true, 00:16:32.535 "nvme_admin": false, 00:16:32.535 "nvme_io": false, 00:16:32.535 "nvme_io_md": false, 00:16:32.535 "write_zeroes": true, 00:16:32.535 "zcopy": true, 00:16:32.535 "get_zone_info": false, 00:16:32.535 "zone_management": false, 00:16:32.535 "zone_append": false, 00:16:32.535 "compare": false, 00:16:32.535 
"compare_and_write": false, 00:16:32.535 "abort": true, 00:16:32.535 "seek_hole": false, 00:16:32.535 "seek_data": false, 00:16:32.535 "copy": true, 00:16:32.535 "nvme_iov_md": false 00:16:32.535 }, 00:16:32.535 "memory_domains": [ 00:16:32.535 { 00:16:32.535 "dma_device_id": "system", 00:16:32.535 "dma_device_type": 1 00:16:32.535 }, 00:16:32.535 { 00:16:32.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.535 "dma_device_type": 2 00:16:32.535 } 00:16:32.535 ], 00:16:32.535 "driver_specific": {} 00:16:32.535 }' 00:16:32.535 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.535 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.794 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:32.794 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.794 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.794 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:32.794 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.794 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.794 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:32.794 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.794 11:57:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.794 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:32.794 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.794 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:32.794 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.052 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.052 "name": "BaseBdev3", 00:16:33.052 "aliases": [ 00:16:33.052 "34208269-03ca-4a75-959e-5547ba4fc081" 00:16:33.052 ], 00:16:33.052 "product_name": "Malloc disk", 00:16:33.052 "block_size": 512, 00:16:33.052 "num_blocks": 65536, 00:16:33.052 "uuid": "34208269-03ca-4a75-959e-5547ba4fc081", 00:16:33.052 "assigned_rate_limits": { 00:16:33.052 "rw_ios_per_sec": 0, 00:16:33.052 "rw_mbytes_per_sec": 0, 00:16:33.052 "r_mbytes_per_sec": 0, 00:16:33.052 "w_mbytes_per_sec": 0 00:16:33.052 }, 00:16:33.052 "claimed": true, 00:16:33.052 "claim_type": "exclusive_write", 00:16:33.052 "zoned": false, 00:16:33.052 "supported_io_types": { 00:16:33.052 "read": true, 00:16:33.052 "write": true, 00:16:33.052 "unmap": true, 00:16:33.052 "flush": true, 00:16:33.052 "reset": true, 00:16:33.052 "nvme_admin": false, 00:16:33.052 "nvme_io": false, 00:16:33.052 "nvme_io_md": false, 00:16:33.052 "write_zeroes": true, 00:16:33.052 "zcopy": true, 00:16:33.052 "get_zone_info": false, 00:16:33.052 "zone_management": false, 00:16:33.052 "zone_append": false, 00:16:33.052 "compare": false, 00:16:33.052 "compare_and_write": false, 00:16:33.052 "abort": true, 00:16:33.052 "seek_hole": false, 00:16:33.052 "seek_data": false, 00:16:33.052 "copy": true, 00:16:33.052 "nvme_iov_md": false 00:16:33.052 }, 00:16:33.052 "memory_domains": [ 00:16:33.052 { 00:16:33.052 "dma_device_id": "system", 00:16:33.052 "dma_device_type": 1 00:16:33.052 }, 00:16:33.052 { 00:16:33.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.052 "dma_device_type": 2 00:16:33.052 } 00:16:33.052 ], 00:16:33.052 "driver_specific": {} 00:16:33.052 }' 00:16:33.052 11:57:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.052 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.052 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.052 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.052 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:33.311 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.569 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.569 "name": "BaseBdev4", 00:16:33.569 "aliases": [ 00:16:33.569 "47de8451-1f1c-4f1e-bafd-8c85dd2e490e" 00:16:33.569 ], 00:16:33.569 "product_name": "Malloc disk", 00:16:33.569 "block_size": 512, 00:16:33.569 
"num_blocks": 65536, 00:16:33.569 "uuid": "47de8451-1f1c-4f1e-bafd-8c85dd2e490e", 00:16:33.569 "assigned_rate_limits": { 00:16:33.569 "rw_ios_per_sec": 0, 00:16:33.569 "rw_mbytes_per_sec": 0, 00:16:33.569 "r_mbytes_per_sec": 0, 00:16:33.569 "w_mbytes_per_sec": 0 00:16:33.569 }, 00:16:33.569 "claimed": true, 00:16:33.569 "claim_type": "exclusive_write", 00:16:33.569 "zoned": false, 00:16:33.569 "supported_io_types": { 00:16:33.569 "read": true, 00:16:33.570 "write": true, 00:16:33.570 "unmap": true, 00:16:33.570 "flush": true, 00:16:33.570 "reset": true, 00:16:33.570 "nvme_admin": false, 00:16:33.570 "nvme_io": false, 00:16:33.570 "nvme_io_md": false, 00:16:33.570 "write_zeroes": true, 00:16:33.570 "zcopy": true, 00:16:33.570 "get_zone_info": false, 00:16:33.570 "zone_management": false, 00:16:33.570 "zone_append": false, 00:16:33.570 "compare": false, 00:16:33.570 "compare_and_write": false, 00:16:33.570 "abort": true, 00:16:33.570 "seek_hole": false, 00:16:33.570 "seek_data": false, 00:16:33.570 "copy": true, 00:16:33.570 "nvme_iov_md": false 00:16:33.570 }, 00:16:33.570 "memory_domains": [ 00:16:33.570 { 00:16:33.570 "dma_device_id": "system", 00:16:33.570 "dma_device_type": 1 00:16:33.570 }, 00:16:33.570 { 00:16:33.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.570 "dma_device_type": 2 00:16:33.570 } 00:16:33.570 ], 00:16:33.570 "driver_specific": {} 00:16:33.570 }' 00:16:33.570 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.570 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.570 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.570 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.570 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.570 11:57:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.570 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.570 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.828 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:33.828 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.828 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.828 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:33.828 11:57:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:33.828 [2024-07-12 11:57:24.045547] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:33.828 [2024-07-12 11:57:24.045565] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:33.828 [2024-07-12 11:57:24.045600] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:33.828 [2024-07-12 11:57:24.045643] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:33.828 [2024-07-12 11:57:24.045649] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x174f6b0 name Existed_Raid, state offline 00:16:33.828 11:57:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 667824 00:16:33.828 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 667824 ']' 00:16:33.828 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 667824 00:16:33.828 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 
-- # uname 00:16:33.828 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:33.828 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 667824 00:16:34.089 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:34.089 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:34.089 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 667824' 00:16:34.089 killing process with pid 667824 00:16:34.089 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 667824 00:16:34.089 [2024-07-12 11:57:24.101753] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:34.089 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 667824 00:16:34.089 [2024-07-12 11:57:24.133014] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:34.089 11:57:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:34.089 00:16:34.089 real 0m24.324s 00:16:34.089 user 0m45.357s 00:16:34.089 sys 0m3.742s 00:16:34.089 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:34.089 11:57:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:34.089 ************************************ 00:16:34.089 END TEST raid_state_function_test_sb 00:16:34.089 ************************************ 00:16:34.089 11:57:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:34.089 11:57:24 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:16:34.089 11:57:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:34.089 11:57:24 bdev_raid 
-- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:34.089 11:57:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:34.390 ************************************ 00:16:34.390 START TEST raid_superblock_test 00:16:34.390 ************************************ 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:34.390 11:57:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=673098 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 673098 /var/tmp/spdk-raid.sock 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 673098 ']' 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:34.390 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:34.390 11:57:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.390 [2024-07-12 11:57:24.417208] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:16:34.390 [2024-07-12 11:57:24.417245] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid673098 ] 00:16:34.390 [2024-07-12 11:57:24.479724] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:34.390 [2024-07-12 11:57:24.558436] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:34.390 [2024-07-12 11:57:24.611238] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:34.390 [2024-07-12 11:57:24.611265] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:34.981 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:16:35.240 malloc1 00:16:35.240 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:35.498 [2024-07-12 11:57:25.531578] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:35.498 [2024-07-12 11:57:25.531610] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:35.498 [2024-07-12 11:57:25.531622] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4e270 00:16:35.498 [2024-07-12 11:57:25.531642] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:35.498 [2024-07-12 11:57:25.532837] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:35.498 [2024-07-12 11:57:25.532857] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:35.498 pt1 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:35.498 malloc2 00:16:35.498 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:35.757 [2024-07-12 11:57:25.868176] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:35.757 [2024-07-12 11:57:25.868205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:35.757 [2024-07-12 11:57:25.868216] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4f580 00:16:35.757 [2024-07-12 11:57:25.868226] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:35.757 [2024-07-12 11:57:25.869408] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:35.757 [2024-07-12 11:57:25.869429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:35.757 pt2 00:16:35.757 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:35.757 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:35.757 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:35.757 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:35.757 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:35.757 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:35.757 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:35.757 11:57:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:35.757 11:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:36.015 malloc3 00:16:36.015 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:36.015 [2024-07-12 11:57:26.196482] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:36.015 [2024-07-12 11:57:26.196511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:36.015 [2024-07-12 11:57:26.196527] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf9e30 00:16:36.015 [2024-07-12 11:57:26.196549] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:36.015 [2024-07-12 11:57:26.197672] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:36.015 [2024-07-12 11:57:26.197691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:36.015 pt3 00:16:36.015 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:36.015 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:36.015 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:16:36.015 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:16:36.015 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:36.015 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:36.015 11:57:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:36.015 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:36.015 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:36.273 malloc4 00:16:36.273 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:36.532 [2024-07-12 11:57:26.524746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:36.532 [2024-07-12 11:57:26.524776] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:36.532 [2024-07-12 11:57:26.524787] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdfc570 00:16:36.532 [2024-07-12 11:57:26.524793] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:36.532 [2024-07-12 11:57:26.525872] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:36.532 [2024-07-12 11:57:26.525891] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:36.532 pt4 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:16:36.532 [2024-07-12 11:57:26.693201] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:16:36.532 [2024-07-12 11:57:26.694097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:36.532 [2024-07-12 11:57:26.694137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:36.532 [2024-07-12 11:57:26.694166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:36.532 [2024-07-12 11:57:26.694279] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdfda80 00:16:36.532 [2024-07-12 11:57:26.694286] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:36.532 [2024-07-12 11:57:26.694420] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc4d5d0 00:16:36.532 [2024-07-12 11:57:26.694527] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdfda80 00:16:36.532 [2024-07-12 11:57:26.694533] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdfda80 00:16:36.532 [2024-07-12 11:57:26.694597] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.532 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:36.790 11:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.790 "name": "raid_bdev1", 00:16:36.790 "uuid": "850bfd65-c7e8-4326-9211-eb5a45a15bb0", 00:16:36.790 "strip_size_kb": 64, 00:16:36.790 "state": "online", 00:16:36.790 "raid_level": "concat", 00:16:36.790 "superblock": true, 00:16:36.790 "num_base_bdevs": 4, 00:16:36.790 "num_base_bdevs_discovered": 4, 00:16:36.790 "num_base_bdevs_operational": 4, 00:16:36.790 "base_bdevs_list": [ 00:16:36.790 { 00:16:36.790 "name": "pt1", 00:16:36.790 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:36.790 "is_configured": true, 00:16:36.790 "data_offset": 2048, 00:16:36.790 "data_size": 63488 00:16:36.790 }, 00:16:36.790 { 00:16:36.790 "name": "pt2", 00:16:36.790 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:36.790 "is_configured": true, 00:16:36.790 "data_offset": 2048, 00:16:36.790 "data_size": 63488 00:16:36.790 }, 00:16:36.790 { 00:16:36.790 "name": "pt3", 00:16:36.790 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:36.790 "is_configured": true, 00:16:36.790 "data_offset": 2048, 00:16:36.790 "data_size": 63488 00:16:36.790 }, 00:16:36.790 { 00:16:36.790 "name": "pt4", 00:16:36.790 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:36.790 "is_configured": true, 00:16:36.790 "data_offset": 2048, 00:16:36.790 "data_size": 63488 00:16:36.791 } 00:16:36.791 ] 00:16:36.791 }' 00:16:36.791 11:57:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.791 11:57:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:37.358 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:37.358 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:37.358 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:37.358 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:37.358 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:37.358 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:37.358 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:37.358 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:37.358 [2024-07-12 11:57:27.515492] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:37.358 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:37.358 "name": "raid_bdev1", 00:16:37.358 "aliases": [ 00:16:37.358 "850bfd65-c7e8-4326-9211-eb5a45a15bb0" 00:16:37.358 ], 00:16:37.358 "product_name": "Raid Volume", 00:16:37.358 "block_size": 512, 00:16:37.358 "num_blocks": 253952, 00:16:37.358 "uuid": "850bfd65-c7e8-4326-9211-eb5a45a15bb0", 00:16:37.358 "assigned_rate_limits": { 00:16:37.358 "rw_ios_per_sec": 0, 00:16:37.358 "rw_mbytes_per_sec": 0, 00:16:37.358 "r_mbytes_per_sec": 0, 00:16:37.358 "w_mbytes_per_sec": 0 00:16:37.358 }, 00:16:37.358 "claimed": false, 00:16:37.358 "zoned": false, 00:16:37.358 "supported_io_types": { 00:16:37.358 "read": true, 00:16:37.358 "write": true, 00:16:37.358 
"unmap": true, 00:16:37.358 "flush": true, 00:16:37.358 "reset": true, 00:16:37.358 "nvme_admin": false, 00:16:37.358 "nvme_io": false, 00:16:37.358 "nvme_io_md": false, 00:16:37.358 "write_zeroes": true, 00:16:37.358 "zcopy": false, 00:16:37.358 "get_zone_info": false, 00:16:37.358 "zone_management": false, 00:16:37.358 "zone_append": false, 00:16:37.358 "compare": false, 00:16:37.358 "compare_and_write": false, 00:16:37.358 "abort": false, 00:16:37.358 "seek_hole": false, 00:16:37.358 "seek_data": false, 00:16:37.358 "copy": false, 00:16:37.358 "nvme_iov_md": false 00:16:37.358 }, 00:16:37.358 "memory_domains": [ 00:16:37.358 { 00:16:37.358 "dma_device_id": "system", 00:16:37.358 "dma_device_type": 1 00:16:37.358 }, 00:16:37.358 { 00:16:37.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.358 "dma_device_type": 2 00:16:37.358 }, 00:16:37.358 { 00:16:37.358 "dma_device_id": "system", 00:16:37.358 "dma_device_type": 1 00:16:37.358 }, 00:16:37.358 { 00:16:37.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.358 "dma_device_type": 2 00:16:37.358 }, 00:16:37.358 { 00:16:37.358 "dma_device_id": "system", 00:16:37.358 "dma_device_type": 1 00:16:37.358 }, 00:16:37.358 { 00:16:37.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.358 "dma_device_type": 2 00:16:37.358 }, 00:16:37.358 { 00:16:37.358 "dma_device_id": "system", 00:16:37.358 "dma_device_type": 1 00:16:37.358 }, 00:16:37.358 { 00:16:37.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.358 "dma_device_type": 2 00:16:37.358 } 00:16:37.358 ], 00:16:37.358 "driver_specific": { 00:16:37.358 "raid": { 00:16:37.358 "uuid": "850bfd65-c7e8-4326-9211-eb5a45a15bb0", 00:16:37.358 "strip_size_kb": 64, 00:16:37.358 "state": "online", 00:16:37.358 "raid_level": "concat", 00:16:37.358 "superblock": true, 00:16:37.358 "num_base_bdevs": 4, 00:16:37.358 "num_base_bdevs_discovered": 4, 00:16:37.358 "num_base_bdevs_operational": 4, 00:16:37.358 "base_bdevs_list": [ 00:16:37.358 { 00:16:37.358 "name": "pt1", 
00:16:37.358 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:37.358 "is_configured": true, 00:16:37.358 "data_offset": 2048, 00:16:37.358 "data_size": 63488 00:16:37.358 }, 00:16:37.358 { 00:16:37.359 "name": "pt2", 00:16:37.359 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:37.359 "is_configured": true, 00:16:37.359 "data_offset": 2048, 00:16:37.359 "data_size": 63488 00:16:37.359 }, 00:16:37.359 { 00:16:37.359 "name": "pt3", 00:16:37.359 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:37.359 "is_configured": true, 00:16:37.359 "data_offset": 2048, 00:16:37.359 "data_size": 63488 00:16:37.359 }, 00:16:37.359 { 00:16:37.359 "name": "pt4", 00:16:37.359 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:37.359 "is_configured": true, 00:16:37.359 "data_offset": 2048, 00:16:37.359 "data_size": 63488 00:16:37.359 } 00:16:37.359 ] 00:16:37.359 } 00:16:37.359 } 00:16:37.359 }' 00:16:37.359 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:37.359 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:37.359 pt2 00:16:37.359 pt3 00:16:37.359 pt4' 00:16:37.359 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.359 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:37.359 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:37.617 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:37.617 "name": "pt1", 00:16:37.617 "aliases": [ 00:16:37.617 "00000000-0000-0000-0000-000000000001" 00:16:37.617 ], 00:16:37.617 "product_name": "passthru", 00:16:37.617 "block_size": 512, 00:16:37.617 "num_blocks": 65536, 00:16:37.617 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:16:37.617 "assigned_rate_limits": { 00:16:37.617 "rw_ios_per_sec": 0, 00:16:37.617 "rw_mbytes_per_sec": 0, 00:16:37.617 "r_mbytes_per_sec": 0, 00:16:37.617 "w_mbytes_per_sec": 0 00:16:37.617 }, 00:16:37.617 "claimed": true, 00:16:37.617 "claim_type": "exclusive_write", 00:16:37.617 "zoned": false, 00:16:37.617 "supported_io_types": { 00:16:37.617 "read": true, 00:16:37.617 "write": true, 00:16:37.617 "unmap": true, 00:16:37.617 "flush": true, 00:16:37.617 "reset": true, 00:16:37.617 "nvme_admin": false, 00:16:37.617 "nvme_io": false, 00:16:37.617 "nvme_io_md": false, 00:16:37.617 "write_zeroes": true, 00:16:37.617 "zcopy": true, 00:16:37.617 "get_zone_info": false, 00:16:37.617 "zone_management": false, 00:16:37.617 "zone_append": false, 00:16:37.617 "compare": false, 00:16:37.617 "compare_and_write": false, 00:16:37.617 "abort": true, 00:16:37.617 "seek_hole": false, 00:16:37.617 "seek_data": false, 00:16:37.617 "copy": true, 00:16:37.617 "nvme_iov_md": false 00:16:37.617 }, 00:16:37.617 "memory_domains": [ 00:16:37.617 { 00:16:37.617 "dma_device_id": "system", 00:16:37.617 "dma_device_type": 1 00:16:37.617 }, 00:16:37.617 { 00:16:37.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.617 "dma_device_type": 2 00:16:37.617 } 00:16:37.617 ], 00:16:37.617 "driver_specific": { 00:16:37.617 "passthru": { 00:16:37.617 "name": "pt1", 00:16:37.617 "base_bdev_name": "malloc1" 00:16:37.617 } 00:16:37.617 } 00:16:37.617 }' 00:16:37.617 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.617 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.617 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.617 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.875 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.875 11:57:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.875 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.875 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.875 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.875 11:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.875 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.875 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:37.875 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.875 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:37.875 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.132 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.132 "name": "pt2", 00:16:38.132 "aliases": [ 00:16:38.132 "00000000-0000-0000-0000-000000000002" 00:16:38.132 ], 00:16:38.132 "product_name": "passthru", 00:16:38.132 "block_size": 512, 00:16:38.132 "num_blocks": 65536, 00:16:38.132 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:38.132 "assigned_rate_limits": { 00:16:38.132 "rw_ios_per_sec": 0, 00:16:38.132 "rw_mbytes_per_sec": 0, 00:16:38.132 "r_mbytes_per_sec": 0, 00:16:38.132 "w_mbytes_per_sec": 0 00:16:38.132 }, 00:16:38.132 "claimed": true, 00:16:38.132 "claim_type": "exclusive_write", 00:16:38.132 "zoned": false, 00:16:38.132 "supported_io_types": { 00:16:38.132 "read": true, 00:16:38.132 "write": true, 00:16:38.132 "unmap": true, 00:16:38.132 "flush": true, 00:16:38.132 "reset": true, 00:16:38.132 "nvme_admin": false, 00:16:38.132 
"nvme_io": false, 00:16:38.132 "nvme_io_md": false, 00:16:38.132 "write_zeroes": true, 00:16:38.132 "zcopy": true, 00:16:38.132 "get_zone_info": false, 00:16:38.132 "zone_management": false, 00:16:38.132 "zone_append": false, 00:16:38.132 "compare": false, 00:16:38.132 "compare_and_write": false, 00:16:38.132 "abort": true, 00:16:38.132 "seek_hole": false, 00:16:38.132 "seek_data": false, 00:16:38.132 "copy": true, 00:16:38.132 "nvme_iov_md": false 00:16:38.132 }, 00:16:38.132 "memory_domains": [ 00:16:38.132 { 00:16:38.132 "dma_device_id": "system", 00:16:38.132 "dma_device_type": 1 00:16:38.132 }, 00:16:38.132 { 00:16:38.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.132 "dma_device_type": 2 00:16:38.132 } 00:16:38.132 ], 00:16:38.132 "driver_specific": { 00:16:38.132 "passthru": { 00:16:38.132 "name": "pt2", 00:16:38.132 "base_bdev_name": "malloc2" 00:16:38.132 } 00:16:38.132 } 00:16:38.132 }' 00:16:38.132 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.132 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.132 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.132 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.132 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:38.390 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.649 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.649 "name": "pt3", 00:16:38.649 "aliases": [ 00:16:38.649 "00000000-0000-0000-0000-000000000003" 00:16:38.649 ], 00:16:38.649 "product_name": "passthru", 00:16:38.649 "block_size": 512, 00:16:38.649 "num_blocks": 65536, 00:16:38.649 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:38.649 "assigned_rate_limits": { 00:16:38.649 "rw_ios_per_sec": 0, 00:16:38.649 "rw_mbytes_per_sec": 0, 00:16:38.649 "r_mbytes_per_sec": 0, 00:16:38.649 "w_mbytes_per_sec": 0 00:16:38.649 }, 00:16:38.649 "claimed": true, 00:16:38.649 "claim_type": "exclusive_write", 00:16:38.649 "zoned": false, 00:16:38.649 "supported_io_types": { 00:16:38.649 "read": true, 00:16:38.649 "write": true, 00:16:38.649 "unmap": true, 00:16:38.649 "flush": true, 00:16:38.649 "reset": true, 00:16:38.649 "nvme_admin": false, 00:16:38.649 "nvme_io": false, 00:16:38.649 "nvme_io_md": false, 00:16:38.649 "write_zeroes": true, 00:16:38.649 "zcopy": true, 00:16:38.649 "get_zone_info": false, 00:16:38.649 "zone_management": false, 00:16:38.649 "zone_append": false, 00:16:38.649 "compare": false, 00:16:38.649 "compare_and_write": false, 00:16:38.649 "abort": true, 00:16:38.649 "seek_hole": false, 00:16:38.649 "seek_data": false, 00:16:38.649 "copy": true, 00:16:38.649 "nvme_iov_md": false 00:16:38.649 }, 00:16:38.649 "memory_domains": [ 00:16:38.649 { 00:16:38.649 "dma_device_id": "system", 00:16:38.649 
"dma_device_type": 1 00:16:38.649 }, 00:16:38.649 { 00:16:38.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.649 "dma_device_type": 2 00:16:38.649 } 00:16:38.649 ], 00:16:38.649 "driver_specific": { 00:16:38.649 "passthru": { 00:16:38.649 "name": "pt3", 00:16:38.649 "base_bdev_name": "malloc3" 00:16:38.649 } 00:16:38.649 } 00:16:38.649 }' 00:16:38.649 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.649 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.649 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.649 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.649 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.649 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.649 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.907 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.907 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.907 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.907 11:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.907 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:38.907 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:38.907 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:38.907 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:39.166 11:57:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:39.166 "name": "pt4", 00:16:39.166 "aliases": [ 00:16:39.166 "00000000-0000-0000-0000-000000000004" 00:16:39.166 ], 00:16:39.166 "product_name": "passthru", 00:16:39.166 "block_size": 512, 00:16:39.166 "num_blocks": 65536, 00:16:39.166 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:39.166 "assigned_rate_limits": { 00:16:39.166 "rw_ios_per_sec": 0, 00:16:39.166 "rw_mbytes_per_sec": 0, 00:16:39.166 "r_mbytes_per_sec": 0, 00:16:39.166 "w_mbytes_per_sec": 0 00:16:39.166 }, 00:16:39.166 "claimed": true, 00:16:39.166 "claim_type": "exclusive_write", 00:16:39.166 "zoned": false, 00:16:39.166 "supported_io_types": { 00:16:39.166 "read": true, 00:16:39.166 "write": true, 00:16:39.166 "unmap": true, 00:16:39.166 "flush": true, 00:16:39.166 "reset": true, 00:16:39.166 "nvme_admin": false, 00:16:39.166 "nvme_io": false, 00:16:39.166 "nvme_io_md": false, 00:16:39.166 "write_zeroes": true, 00:16:39.166 "zcopy": true, 00:16:39.166 "get_zone_info": false, 00:16:39.166 "zone_management": false, 00:16:39.166 "zone_append": false, 00:16:39.166 "compare": false, 00:16:39.166 "compare_and_write": false, 00:16:39.166 "abort": true, 00:16:39.166 "seek_hole": false, 00:16:39.166 "seek_data": false, 00:16:39.166 "copy": true, 00:16:39.166 "nvme_iov_md": false 00:16:39.166 }, 00:16:39.166 "memory_domains": [ 00:16:39.166 { 00:16:39.166 "dma_device_id": "system", 00:16:39.166 "dma_device_type": 1 00:16:39.166 }, 00:16:39.166 { 00:16:39.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.166 "dma_device_type": 2 00:16:39.166 } 00:16:39.166 ], 00:16:39.166 "driver_specific": { 00:16:39.166 "passthru": { 00:16:39.166 "name": "pt4", 00:16:39.166 "base_bdev_name": "malloc4" 00:16:39.166 } 00:16:39.166 } 00:16:39.166 }' 00:16:39.166 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.166 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.166 11:57:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:39.166 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.166 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.166 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:39.166 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.166 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.425 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.425 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.425 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.425 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.425 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:39.425 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:39.682 [2024-07-12 11:57:29.677211] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:39.682 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=850bfd65-c7e8-4326-9211-eb5a45a15bb0 00:16:39.682 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 850bfd65-c7e8-4326-9211-eb5a45a15bb0 ']' 00:16:39.682 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:39.682 [2024-07-12 11:57:29.841432] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:39.682 
[2024-07-12 11:57:29.841447] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:39.682 [2024-07-12 11:57:29.841481] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:39.682 [2024-07-12 11:57:29.841528] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:39.682 [2024-07-12 11:57:29.841534] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdfda80 name raid_bdev1, state offline 00:16:39.682 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.682 11:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:39.938 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:39.938 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:39.938 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:39.938 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:40.195 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:40.196 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:40.196 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:40.196 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:40.454 11:57:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:40.454 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:16:40.454 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:40.454 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:40.712 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:40.970 [2024-07-12 11:57:30.964327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:40.970 [2024-07-12 11:57:30.965276] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:40.970 [2024-07-12 11:57:30.965305] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:40.970 [2024-07-12 11:57:30.965325] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:16:40.970 [2024-07-12 11:57:30.965361] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:40.970 [2024-07-12 11:57:30.965386] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:40.970 [2024-07-12 11:57:30.965414] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:40.970 [2024-07-12 11:57:30.965426] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:16:40.970 
[2024-07-12 11:57:30.965435] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:40.970 [2024-07-12 11:57:30.965441] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdfc7a0 name raid_bdev1, state configuring 00:16:40.970 request: 00:16:40.970 { 00:16:40.970 "name": "raid_bdev1", 00:16:40.970 "raid_level": "concat", 00:16:40.970 "base_bdevs": [ 00:16:40.970 "malloc1", 00:16:40.970 "malloc2", 00:16:40.970 "malloc3", 00:16:40.970 "malloc4" 00:16:40.970 ], 00:16:40.970 "superblock": false, 00:16:40.970 "strip_size_kb": 64, 00:16:40.970 "method": "bdev_raid_create", 00:16:40.970 "req_id": 1 00:16:40.970 } 00:16:40.970 Got JSON-RPC error response 00:16:40.970 response: 00:16:40.970 { 00:16:40.970 "code": -17, 00:16:40.970 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:40.970 } 00:16:40.970 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:40.970 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:40.970 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:40.970 11:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:40.970 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.970 11:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:40.970 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:40.970 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:40.970 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:16:41.229 [2024-07-12 11:57:31.293150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:41.229 [2024-07-12 11:57:31.293175] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:41.229 [2024-07-12 11:57:31.293185] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf90c0 00:16:41.229 [2024-07-12 11:57:31.293191] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:41.229 [2024-07-12 11:57:31.294379] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:41.229 [2024-07-12 11:57:31.294398] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:41.229 [2024-07-12 11:57:31.294441] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:41.229 [2024-07-12 11:57:31.294458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:41.229 pt1 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.229 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:41.487 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.487 "name": "raid_bdev1", 00:16:41.487 "uuid": "850bfd65-c7e8-4326-9211-eb5a45a15bb0", 00:16:41.487 "strip_size_kb": 64, 00:16:41.487 "state": "configuring", 00:16:41.487 "raid_level": "concat", 00:16:41.487 "superblock": true, 00:16:41.487 "num_base_bdevs": 4, 00:16:41.487 "num_base_bdevs_discovered": 1, 00:16:41.487 "num_base_bdevs_operational": 4, 00:16:41.487 "base_bdevs_list": [ 00:16:41.487 { 00:16:41.487 "name": "pt1", 00:16:41.487 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:41.487 "is_configured": true, 00:16:41.487 "data_offset": 2048, 00:16:41.487 "data_size": 63488 00:16:41.487 }, 00:16:41.487 { 00:16:41.487 "name": null, 00:16:41.487 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:41.487 "is_configured": false, 00:16:41.487 "data_offset": 2048, 00:16:41.487 "data_size": 63488 00:16:41.487 }, 00:16:41.487 { 00:16:41.487 "name": null, 00:16:41.487 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:41.487 "is_configured": false, 00:16:41.487 "data_offset": 2048, 00:16:41.487 "data_size": 63488 00:16:41.487 }, 00:16:41.487 { 00:16:41.487 "name": null, 00:16:41.487 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:41.487 "is_configured": false, 00:16:41.487 "data_offset": 2048, 00:16:41.487 "data_size": 63488 00:16:41.487 } 00:16:41.487 ] 00:16:41.487 }' 00:16:41.487 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.487 11:57:31 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.746 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:16:41.746 11:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:42.004 [2024-07-12 11:57:32.115280] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:42.004 [2024-07-12 11:57:32.115314] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:42.004 [2024-07-12 11:57:32.115326] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdff300 00:16:42.004 [2024-07-12 11:57:32.115332] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:42.004 [2024-07-12 11:57:32.115583] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:42.004 [2024-07-12 11:57:32.115594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:42.004 [2024-07-12 11:57:32.115637] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:42.004 [2024-07-12 11:57:32.115650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:42.004 pt2 00:16:42.004 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:42.263 [2024-07-12 11:57:32.283731] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:42.263 11:57:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.263 "name": "raid_bdev1", 00:16:42.263 "uuid": "850bfd65-c7e8-4326-9211-eb5a45a15bb0", 00:16:42.263 "strip_size_kb": 64, 00:16:42.263 "state": "configuring", 00:16:42.263 "raid_level": "concat", 00:16:42.263 "superblock": true, 00:16:42.263 "num_base_bdevs": 4, 00:16:42.263 "num_base_bdevs_discovered": 1, 00:16:42.263 "num_base_bdevs_operational": 4, 00:16:42.263 "base_bdevs_list": [ 00:16:42.263 { 00:16:42.263 "name": "pt1", 00:16:42.263 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:42.263 "is_configured": true, 00:16:42.263 "data_offset": 2048, 00:16:42.263 "data_size": 63488 00:16:42.263 }, 00:16:42.263 { 00:16:42.263 "name": null, 00:16:42.263 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:42.263 
"is_configured": false, 00:16:42.263 "data_offset": 2048, 00:16:42.263 "data_size": 63488 00:16:42.263 }, 00:16:42.263 { 00:16:42.263 "name": null, 00:16:42.263 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:42.263 "is_configured": false, 00:16:42.263 "data_offset": 2048, 00:16:42.263 "data_size": 63488 00:16:42.263 }, 00:16:42.263 { 00:16:42.263 "name": null, 00:16:42.263 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:42.263 "is_configured": false, 00:16:42.263 "data_offset": 2048, 00:16:42.263 "data_size": 63488 00:16:42.263 } 00:16:42.263 ] 00:16:42.263 }' 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.263 11:57:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.828 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:42.828 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:42.828 11:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:43.084 [2024-07-12 11:57:33.105842] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:43.084 [2024-07-12 11:57:33.105877] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:43.085 [2024-07-12 11:57:33.105888] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdff630 00:16:43.085 [2024-07-12 11:57:33.105909] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:43.085 [2024-07-12 11:57:33.106154] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:43.085 [2024-07-12 11:57:33.106164] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:43.085 [2024-07-12 11:57:33.106205] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:43.085 [2024-07-12 11:57:33.106216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:43.085 pt2 00:16:43.085 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:43.085 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:43.085 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:43.085 [2024-07-12 11:57:33.274274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:43.085 [2024-07-12 11:57:33.274292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:43.085 [2024-07-12 11:57:33.274300] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdff8e0 00:16:43.085 [2024-07-12 11:57:33.274305] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:43.085 [2024-07-12 11:57:33.274506] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:43.085 [2024-07-12 11:57:33.274515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:43.085 [2024-07-12 11:57:33.274551] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:43.085 [2024-07-12 11:57:33.274561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:43.085 pt3 00:16:43.085 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:43.085 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:43.085 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:43.342 [2024-07-12 11:57:33.442732] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:43.342 [2024-07-12 11:57:33.442754] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:43.342 [2024-07-12 11:57:33.442761] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdfaff0 00:16:43.342 [2024-07-12 11:57:33.442766] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:43.342 [2024-07-12 11:57:33.442964] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:43.342 [2024-07-12 11:57:33.442973] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:43.342 [2024-07-12 11:57:33.443001] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:16:43.342 [2024-07-12 11:57:33.443011] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:43.342 [2024-07-12 11:57:33.443086] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdf8280 00:16:43.342 [2024-07-12 11:57:33.443092] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:43.342 [2024-07-12 11:57:33.443200] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce5030 00:16:43.342 [2024-07-12 11:57:33.443284] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdf8280 00:16:43.342 [2024-07-12 11:57:33.443289] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdf8280 00:16:43.342 [2024-07-12 11:57:33.443350] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:43.342 pt4 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.342 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.343 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.343 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.343 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.343 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:43.601 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.601 "name": "raid_bdev1", 00:16:43.601 "uuid": "850bfd65-c7e8-4326-9211-eb5a45a15bb0", 00:16:43.601 "strip_size_kb": 64, 00:16:43.601 "state": "online", 00:16:43.601 "raid_level": "concat", 00:16:43.601 "superblock": true, 00:16:43.601 "num_base_bdevs": 4, 00:16:43.601 "num_base_bdevs_discovered": 4, 00:16:43.601 "num_base_bdevs_operational": 4, 00:16:43.601 "base_bdevs_list": [ 00:16:43.601 { 
00:16:43.601 "name": "pt1", 00:16:43.601 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:43.601 "is_configured": true, 00:16:43.601 "data_offset": 2048, 00:16:43.601 "data_size": 63488 00:16:43.601 }, 00:16:43.601 { 00:16:43.601 "name": "pt2", 00:16:43.601 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:43.601 "is_configured": true, 00:16:43.601 "data_offset": 2048, 00:16:43.601 "data_size": 63488 00:16:43.601 }, 00:16:43.601 { 00:16:43.601 "name": "pt3", 00:16:43.601 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:43.601 "is_configured": true, 00:16:43.601 "data_offset": 2048, 00:16:43.601 "data_size": 63488 00:16:43.601 }, 00:16:43.601 { 00:16:43.601 "name": "pt4", 00:16:43.601 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:43.601 "is_configured": true, 00:16:43.601 "data_offset": 2048, 00:16:43.601 "data_size": 63488 00:16:43.601 } 00:16:43.601 ] 00:16:43.601 }' 00:16:43.601 11:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.601 11:57:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:44.168 [2024-07-12 11:57:34.277110] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:44.168 "name": "raid_bdev1", 00:16:44.168 "aliases": [ 00:16:44.168 "850bfd65-c7e8-4326-9211-eb5a45a15bb0" 00:16:44.168 ], 00:16:44.168 "product_name": "Raid Volume", 00:16:44.168 "block_size": 512, 00:16:44.168 "num_blocks": 253952, 00:16:44.168 "uuid": "850bfd65-c7e8-4326-9211-eb5a45a15bb0", 00:16:44.168 "assigned_rate_limits": { 00:16:44.168 "rw_ios_per_sec": 0, 00:16:44.168 "rw_mbytes_per_sec": 0, 00:16:44.168 "r_mbytes_per_sec": 0, 00:16:44.168 "w_mbytes_per_sec": 0 00:16:44.168 }, 00:16:44.168 "claimed": false, 00:16:44.168 "zoned": false, 00:16:44.168 "supported_io_types": { 00:16:44.168 "read": true, 00:16:44.168 "write": true, 00:16:44.168 "unmap": true, 00:16:44.168 "flush": true, 00:16:44.168 "reset": true, 00:16:44.168 "nvme_admin": false, 00:16:44.168 "nvme_io": false, 00:16:44.168 "nvme_io_md": false, 00:16:44.168 "write_zeroes": true, 00:16:44.168 "zcopy": false, 00:16:44.168 "get_zone_info": false, 00:16:44.168 "zone_management": false, 00:16:44.168 "zone_append": false, 00:16:44.168 "compare": false, 00:16:44.168 "compare_and_write": false, 00:16:44.168 "abort": false, 00:16:44.168 "seek_hole": false, 00:16:44.168 "seek_data": false, 00:16:44.168 "copy": false, 00:16:44.168 "nvme_iov_md": false 00:16:44.168 }, 00:16:44.168 "memory_domains": [ 00:16:44.168 { 00:16:44.168 "dma_device_id": "system", 00:16:44.168 "dma_device_type": 1 00:16:44.168 }, 00:16:44.168 { 00:16:44.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.168 "dma_device_type": 2 00:16:44.168 }, 00:16:44.168 { 00:16:44.168 "dma_device_id": "system", 00:16:44.168 "dma_device_type": 1 00:16:44.168 }, 00:16:44.168 { 00:16:44.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.168 "dma_device_type": 2 00:16:44.168 }, 
00:16:44.168 { 00:16:44.168 "dma_device_id": "system", 00:16:44.168 "dma_device_type": 1 00:16:44.168 }, 00:16:44.168 { 00:16:44.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.168 "dma_device_type": 2 00:16:44.168 }, 00:16:44.168 { 00:16:44.168 "dma_device_id": "system", 00:16:44.168 "dma_device_type": 1 00:16:44.168 }, 00:16:44.168 { 00:16:44.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.168 "dma_device_type": 2 00:16:44.168 } 00:16:44.168 ], 00:16:44.168 "driver_specific": { 00:16:44.168 "raid": { 00:16:44.168 "uuid": "850bfd65-c7e8-4326-9211-eb5a45a15bb0", 00:16:44.168 "strip_size_kb": 64, 00:16:44.168 "state": "online", 00:16:44.168 "raid_level": "concat", 00:16:44.168 "superblock": true, 00:16:44.168 "num_base_bdevs": 4, 00:16:44.168 "num_base_bdevs_discovered": 4, 00:16:44.168 "num_base_bdevs_operational": 4, 00:16:44.168 "base_bdevs_list": [ 00:16:44.168 { 00:16:44.168 "name": "pt1", 00:16:44.168 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:44.168 "is_configured": true, 00:16:44.168 "data_offset": 2048, 00:16:44.168 "data_size": 63488 00:16:44.168 }, 00:16:44.168 { 00:16:44.168 "name": "pt2", 00:16:44.168 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:44.168 "is_configured": true, 00:16:44.168 "data_offset": 2048, 00:16:44.168 "data_size": 63488 00:16:44.168 }, 00:16:44.168 { 00:16:44.168 "name": "pt3", 00:16:44.168 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:44.168 "is_configured": true, 00:16:44.168 "data_offset": 2048, 00:16:44.168 "data_size": 63488 00:16:44.168 }, 00:16:44.168 { 00:16:44.168 "name": "pt4", 00:16:44.168 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:44.168 "is_configured": true, 00:16:44.168 "data_offset": 2048, 00:16:44.168 "data_size": 63488 00:16:44.168 } 00:16:44.168 ] 00:16:44.168 } 00:16:44.168 } 00:16:44.168 }' 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:44.168 pt2 00:16:44.168 pt3 00:16:44.168 pt4' 00:16:44.168 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:44.169 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:44.169 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:44.427 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:44.427 "name": "pt1", 00:16:44.427 "aliases": [ 00:16:44.427 "00000000-0000-0000-0000-000000000001" 00:16:44.427 ], 00:16:44.427 "product_name": "passthru", 00:16:44.427 "block_size": 512, 00:16:44.427 "num_blocks": 65536, 00:16:44.427 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:44.427 "assigned_rate_limits": { 00:16:44.427 "rw_ios_per_sec": 0, 00:16:44.427 "rw_mbytes_per_sec": 0, 00:16:44.427 "r_mbytes_per_sec": 0, 00:16:44.427 "w_mbytes_per_sec": 0 00:16:44.427 }, 00:16:44.427 "claimed": true, 00:16:44.427 "claim_type": "exclusive_write", 00:16:44.427 "zoned": false, 00:16:44.427 "supported_io_types": { 00:16:44.427 "read": true, 00:16:44.427 "write": true, 00:16:44.427 "unmap": true, 00:16:44.427 "flush": true, 00:16:44.427 "reset": true, 00:16:44.427 "nvme_admin": false, 00:16:44.427 "nvme_io": false, 00:16:44.427 "nvme_io_md": false, 00:16:44.427 "write_zeroes": true, 00:16:44.427 "zcopy": true, 00:16:44.427 "get_zone_info": false, 00:16:44.427 "zone_management": false, 00:16:44.427 "zone_append": false, 00:16:44.427 "compare": false, 00:16:44.427 "compare_and_write": false, 00:16:44.427 "abort": true, 00:16:44.427 "seek_hole": false, 00:16:44.427 "seek_data": false, 00:16:44.427 "copy": true, 00:16:44.427 "nvme_iov_md": false 00:16:44.427 }, 00:16:44.427 "memory_domains": [ 00:16:44.427 { 
00:16:44.427 "dma_device_id": "system", 00:16:44.427 "dma_device_type": 1 00:16:44.427 }, 00:16:44.427 { 00:16:44.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.427 "dma_device_type": 2 00:16:44.427 } 00:16:44.427 ], 00:16:44.427 "driver_specific": { 00:16:44.427 "passthru": { 00:16:44.427 "name": "pt1", 00:16:44.427 "base_bdev_name": "malloc1" 00:16:44.427 } 00:16:44.427 } 00:16:44.427 }' 00:16:44.427 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.427 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.427 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:44.427 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:44.427 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:44.427 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:44.427 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:44.685 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:44.685 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:44.685 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:44.685 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:44.685 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:44.685 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:44.685 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:44.685 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:44.943 
11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:44.943 "name": "pt2", 00:16:44.943 "aliases": [ 00:16:44.943 "00000000-0000-0000-0000-000000000002" 00:16:44.943 ], 00:16:44.943 "product_name": "passthru", 00:16:44.943 "block_size": 512, 00:16:44.943 "num_blocks": 65536, 00:16:44.943 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:44.943 "assigned_rate_limits": { 00:16:44.943 "rw_ios_per_sec": 0, 00:16:44.943 "rw_mbytes_per_sec": 0, 00:16:44.943 "r_mbytes_per_sec": 0, 00:16:44.943 "w_mbytes_per_sec": 0 00:16:44.943 }, 00:16:44.943 "claimed": true, 00:16:44.943 "claim_type": "exclusive_write", 00:16:44.943 "zoned": false, 00:16:44.943 "supported_io_types": { 00:16:44.943 "read": true, 00:16:44.943 "write": true, 00:16:44.943 "unmap": true, 00:16:44.943 "flush": true, 00:16:44.943 "reset": true, 00:16:44.943 "nvme_admin": false, 00:16:44.943 "nvme_io": false, 00:16:44.943 "nvme_io_md": false, 00:16:44.943 "write_zeroes": true, 00:16:44.943 "zcopy": true, 00:16:44.943 "get_zone_info": false, 00:16:44.943 "zone_management": false, 00:16:44.943 "zone_append": false, 00:16:44.943 "compare": false, 00:16:44.943 "compare_and_write": false, 00:16:44.943 "abort": true, 00:16:44.943 "seek_hole": false, 00:16:44.943 "seek_data": false, 00:16:44.943 "copy": true, 00:16:44.943 "nvme_iov_md": false 00:16:44.943 }, 00:16:44.943 "memory_domains": [ 00:16:44.943 { 00:16:44.943 "dma_device_id": "system", 00:16:44.943 "dma_device_type": 1 00:16:44.943 }, 00:16:44.943 { 00:16:44.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.943 "dma_device_type": 2 00:16:44.943 } 00:16:44.943 ], 00:16:44.943 "driver_specific": { 00:16:44.943 "passthru": { 00:16:44.943 "name": "pt2", 00:16:44.943 "base_bdev_name": "malloc2" 00:16:44.943 } 00:16:44.943 } 00:16:44.943 }' 00:16:44.943 11:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.943 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:16:44.943 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:44.943 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:44.943 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:44.943 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:44.943 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:44.943 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.202 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:45.202 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.202 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.202 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:45.202 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:45.202 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:45.202 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:45.202 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:45.202 "name": "pt3", 00:16:45.202 "aliases": [ 00:16:45.202 "00000000-0000-0000-0000-000000000003" 00:16:45.202 ], 00:16:45.202 "product_name": "passthru", 00:16:45.202 "block_size": 512, 00:16:45.202 "num_blocks": 65536, 00:16:45.202 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:45.202 "assigned_rate_limits": { 00:16:45.202 "rw_ios_per_sec": 0, 00:16:45.202 "rw_mbytes_per_sec": 0, 00:16:45.202 "r_mbytes_per_sec": 0, 00:16:45.202 "w_mbytes_per_sec": 0 00:16:45.202 }, 
00:16:45.202 "claimed": true, 00:16:45.202 "claim_type": "exclusive_write", 00:16:45.202 "zoned": false, 00:16:45.202 "supported_io_types": { 00:16:45.202 "read": true, 00:16:45.202 "write": true, 00:16:45.202 "unmap": true, 00:16:45.202 "flush": true, 00:16:45.202 "reset": true, 00:16:45.202 "nvme_admin": false, 00:16:45.202 "nvme_io": false, 00:16:45.202 "nvme_io_md": false, 00:16:45.202 "write_zeroes": true, 00:16:45.202 "zcopy": true, 00:16:45.202 "get_zone_info": false, 00:16:45.202 "zone_management": false, 00:16:45.202 "zone_append": false, 00:16:45.202 "compare": false, 00:16:45.202 "compare_and_write": false, 00:16:45.202 "abort": true, 00:16:45.202 "seek_hole": false, 00:16:45.202 "seek_data": false, 00:16:45.202 "copy": true, 00:16:45.202 "nvme_iov_md": false 00:16:45.202 }, 00:16:45.202 "memory_domains": [ 00:16:45.202 { 00:16:45.202 "dma_device_id": "system", 00:16:45.202 "dma_device_type": 1 00:16:45.202 }, 00:16:45.202 { 00:16:45.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.202 "dma_device_type": 2 00:16:45.202 } 00:16:45.202 ], 00:16:45.202 "driver_specific": { 00:16:45.202 "passthru": { 00:16:45.202 "name": "pt3", 00:16:45.202 "base_bdev_name": "malloc3" 00:16:45.202 } 00:16:45.202 } 00:16:45.202 }' 00:16:45.202 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.461 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.720 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:45.720 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:45.720 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:45.720 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:45.720 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:45.720 "name": "pt4", 00:16:45.720 "aliases": [ 00:16:45.720 "00000000-0000-0000-0000-000000000004" 00:16:45.720 ], 00:16:45.720 "product_name": "passthru", 00:16:45.720 "block_size": 512, 00:16:45.720 "num_blocks": 65536, 00:16:45.720 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:45.720 "assigned_rate_limits": { 00:16:45.720 "rw_ios_per_sec": 0, 00:16:45.720 "rw_mbytes_per_sec": 0, 00:16:45.720 "r_mbytes_per_sec": 0, 00:16:45.720 "w_mbytes_per_sec": 0 00:16:45.720 }, 00:16:45.720 "claimed": true, 00:16:45.720 "claim_type": "exclusive_write", 00:16:45.720 "zoned": false, 00:16:45.720 "supported_io_types": { 00:16:45.720 "read": true, 00:16:45.720 "write": true, 00:16:45.720 "unmap": true, 00:16:45.720 "flush": true, 00:16:45.720 "reset": true, 00:16:45.720 "nvme_admin": false, 00:16:45.720 "nvme_io": false, 00:16:45.720 "nvme_io_md": false, 00:16:45.720 "write_zeroes": true, 00:16:45.720 "zcopy": true, 00:16:45.720 "get_zone_info": false, 00:16:45.720 "zone_management": false, 00:16:45.720 "zone_append": false, 00:16:45.720 
"compare": false, 00:16:45.720 "compare_and_write": false, 00:16:45.720 "abort": true, 00:16:45.720 "seek_hole": false, 00:16:45.720 "seek_data": false, 00:16:45.720 "copy": true, 00:16:45.720 "nvme_iov_md": false 00:16:45.720 }, 00:16:45.720 "memory_domains": [ 00:16:45.720 { 00:16:45.720 "dma_device_id": "system", 00:16:45.720 "dma_device_type": 1 00:16:45.720 }, 00:16:45.720 { 00:16:45.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.720 "dma_device_type": 2 00:16:45.720 } 00:16:45.720 ], 00:16:45.720 "driver_specific": { 00:16:45.720 "passthru": { 00:16:45.720 "name": "pt4", 00:16:45.720 "base_bdev_name": "malloc4" 00:16:45.720 } 00:16:45.720 } 00:16:45.720 }' 00:16:45.720 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:45.720 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:45.979 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:45.979 11:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.979 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.979 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:45.979 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.979 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.979 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:45.979 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.979 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.979 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:45.980 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:45.980 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:46.238 [2024-07-12 11:57:36.370527] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 850bfd65-c7e8-4326-9211-eb5a45a15bb0 '!=' 850bfd65-c7e8-4326-9211-eb5a45a15bb0 ']' 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 673098 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 673098 ']' 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 673098 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 673098 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 673098' 00:16:46.238 killing process with pid 673098 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 673098 00:16:46.238 [2024-07-12 
11:57:36.434202] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:46.238 [2024-07-12 11:57:36.434247] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:46.238 [2024-07-12 11:57:36.434291] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:46.238 [2024-07-12 11:57:36.434297] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdf8280 name raid_bdev1, state offline 00:16:46.238 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 673098 00:16:46.238 [2024-07-12 11:57:36.466415] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:46.498 11:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:46.498 00:16:46.498 real 0m12.274s 00:16:46.498 user 0m22.507s 00:16:46.498 sys 0m1.845s 00:16:46.498 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:46.498 11:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.498 ************************************ 00:16:46.498 END TEST raid_superblock_test 00:16:46.498 ************************************ 00:16:46.498 11:57:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:46.498 11:57:36 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:16:46.498 11:57:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:46.498 11:57:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:46.498 11:57:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:46.498 ************************************ 00:16:46.498 START TEST raid_read_error_test 00:16:46.498 ************************************ 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:16:46.498 11:57:36 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:46.498 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.jJaDGjfZN0 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=675482 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 675482 /var/tmp/spdk-raid.sock 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 675482 ']' 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:16:46.499 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:46.499 11:57:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.757 [2024-07-12 11:57:36.771859] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:16:46.757 [2024-07-12 11:57:36.771895] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid675482 ] 00:16:46.757 [2024-07-12 11:57:36.834157] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:46.757 [2024-07-12 11:57:36.911171] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:46.757 [2024-07-12 11:57:36.961009] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:46.757 [2024-07-12 11:57:36.961033] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:47.325 11:57:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:47.325 11:57:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:47.325 11:57:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:47.325 11:57:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:47.584 BaseBdev1_malloc 00:16:47.584 11:57:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:47.844 true 00:16:47.844 11:57:37 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:47.844 [2024-07-12 11:57:38.076686] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:47.844 [2024-07-12 11:57:38.076716] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:47.844 [2024-07-12 11:57:38.076729] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x282f2d0 00:16:47.844 [2024-07-12 11:57:38.076735] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:47.844 [2024-07-12 11:57:38.077854] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:47.844 [2024-07-12 11:57:38.077873] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:47.844 BaseBdev1 00:16:48.103 11:57:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:48.103 11:57:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:48.103 BaseBdev2_malloc 00:16:48.103 11:57:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:48.362 true 00:16:48.362 11:57:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:48.362 [2024-07-12 11:57:38.585265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:48.362 [2024-07-12 11:57:38.585291] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:48.362 [2024-07-12 11:57:38.585300] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2833f40 00:16:48.362 [2024-07-12 11:57:38.585305] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:48.362 [2024-07-12 11:57:38.586234] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:48.362 [2024-07-12 11:57:38.586253] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:48.362 BaseBdev2 00:16:48.362 11:57:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:48.362 11:57:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:48.620 BaseBdev3_malloc 00:16:48.620 11:57:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:48.879 true 00:16:48.879 11:57:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:48.879 [2024-07-12 11:57:39.081756] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:48.879 [2024-07-12 11:57:39.081783] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:48.879 [2024-07-12 11:57:39.081792] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2836ea0 00:16:48.879 [2024-07-12 11:57:39.081798] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:48.879 [2024-07-12 11:57:39.082734] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:16:48.880 [2024-07-12 11:57:39.082753] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:48.880 BaseBdev3 00:16:48.880 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:48.880 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:16:49.138 BaseBdev4_malloc 00:16:49.138 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:16:49.397 true 00:16:49.397 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:16:49.397 [2024-07-12 11:57:39.586511] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:16:49.397 [2024-07-12 11:57:39.586543] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:49.397 [2024-07-12 11:57:39.586553] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28302f0 00:16:49.397 [2024-07-12 11:57:39.586559] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:49.397 [2024-07-12 11:57:39.587447] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:49.397 [2024-07-12 11:57:39.587466] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:16:49.397 BaseBdev4 00:16:49.397 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:16:49.656 [2024-07-12 11:57:39.758995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:49.656 [2024-07-12 11:57:39.759770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:49.656 [2024-07-12 11:57:39.759813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:49.656 [2024-07-12 11:57:39.759850] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:49.656 [2024-07-12 11:57:39.759988] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2830d30 00:16:49.656 [2024-07-12 11:57:39.759994] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:49.656 [2024-07-12 11:57:39.760110] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2719720 00:16:49.656 [2024-07-12 11:57:39.760203] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2830d30 00:16:49.656 [2024-07-12 11:57:39.760207] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2830d30 00:16:49.656 [2024-07-12 11:57:39.760268] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.656 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:49.921 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.922 "name": "raid_bdev1", 00:16:49.922 "uuid": "7a272dcf-d946-40f5-b617-12289dcbc0f6", 00:16:49.922 "strip_size_kb": 64, 00:16:49.922 "state": "online", 00:16:49.922 "raid_level": "concat", 00:16:49.922 "superblock": true, 00:16:49.922 "num_base_bdevs": 4, 00:16:49.922 "num_base_bdevs_discovered": 4, 00:16:49.922 "num_base_bdevs_operational": 4, 00:16:49.922 "base_bdevs_list": [ 00:16:49.922 { 00:16:49.922 "name": "BaseBdev1", 00:16:49.922 "uuid": "6d105c8a-8a72-5091-b15a-985827424731", 00:16:49.922 "is_configured": true, 00:16:49.922 "data_offset": 2048, 00:16:49.922 "data_size": 63488 00:16:49.922 }, 00:16:49.922 { 00:16:49.922 "name": "BaseBdev2", 00:16:49.922 "uuid": "43d62543-c968-5df8-be12-133a4f076d73", 00:16:49.922 "is_configured": true, 00:16:49.922 "data_offset": 2048, 00:16:49.922 "data_size": 63488 00:16:49.922 }, 00:16:49.922 { 00:16:49.922 "name": "BaseBdev3", 00:16:49.922 "uuid": "51523ece-f916-57e9-9e21-08410180ca68", 00:16:49.922 "is_configured": true, 00:16:49.922 "data_offset": 2048, 00:16:49.922 "data_size": 63488 00:16:49.922 }, 00:16:49.922 { 00:16:49.922 "name": "BaseBdev4", 00:16:49.922 "uuid": "46eebef2-1d64-5c42-8996-de0254ac92c2", 00:16:49.922 
"is_configured": true, 00:16:49.922 "data_offset": 2048, 00:16:49.922 "data_size": 63488 00:16:49.922 } 00:16:49.922 ] 00:16:49.922 }' 00:16:49.922 11:57:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.922 11:57:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.493 11:57:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:50.493 11:57:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:50.493 [2024-07-12 11:57:40.537212] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x271d020 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.431 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:51.690 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:51.690 "name": "raid_bdev1", 00:16:51.690 "uuid": "7a272dcf-d946-40f5-b617-12289dcbc0f6", 00:16:51.690 "strip_size_kb": 64, 00:16:51.690 "state": "online", 00:16:51.690 "raid_level": "concat", 00:16:51.690 "superblock": true, 00:16:51.690 "num_base_bdevs": 4, 00:16:51.690 "num_base_bdevs_discovered": 4, 00:16:51.690 "num_base_bdevs_operational": 4, 00:16:51.690 "base_bdevs_list": [ 00:16:51.690 { 00:16:51.690 "name": "BaseBdev1", 00:16:51.690 "uuid": "6d105c8a-8a72-5091-b15a-985827424731", 00:16:51.690 "is_configured": true, 00:16:51.690 "data_offset": 2048, 00:16:51.690 "data_size": 63488 00:16:51.690 }, 00:16:51.690 { 00:16:51.690 "name": "BaseBdev2", 00:16:51.690 "uuid": "43d62543-c968-5df8-be12-133a4f076d73", 00:16:51.690 "is_configured": true, 00:16:51.690 "data_offset": 2048, 00:16:51.690 "data_size": 63488 00:16:51.690 }, 00:16:51.690 { 00:16:51.690 "name": "BaseBdev3", 00:16:51.690 "uuid": "51523ece-f916-57e9-9e21-08410180ca68", 00:16:51.690 "is_configured": true, 00:16:51.690 "data_offset": 2048, 00:16:51.690 "data_size": 63488 00:16:51.690 }, 00:16:51.690 { 00:16:51.690 "name": "BaseBdev4", 00:16:51.690 "uuid": 
"46eebef2-1d64-5c42-8996-de0254ac92c2", 00:16:51.690 "is_configured": true, 00:16:51.690 "data_offset": 2048, 00:16:51.690 "data_size": 63488 00:16:51.690 } 00:16:51.690 ] 00:16:51.690 }' 00:16:51.690 11:57:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:51.690 11:57:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:52.258 [2024-07-12 11:57:42.425926] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:52.258 [2024-07-12 11:57:42.425956] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:52.258 [2024-07-12 11:57:42.428056] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:52.258 [2024-07-12 11:57:42.428082] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:52.258 [2024-07-12 11:57:42.428107] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:52.258 [2024-07-12 11:57:42.428113] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2830d30 name raid_bdev1, state offline 00:16:52.258 0 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 675482 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 675482 ']' 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 675482 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps 
--no-headers -o comm= 675482 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 675482' 00:16:52.258 killing process with pid 675482 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 675482 00:16:52.258 [2024-07-12 11:57:42.491174] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:52.258 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 675482 00:16:52.517 [2024-07-12 11:57:42.517998] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.jJaDGjfZN0 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:16:52.517 00:16:52.517 real 0m6.000s 00:16:52.517 user 0m9.456s 00:16:52.517 sys 0m0.865s 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:52.517 11:57:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.517 
************************************ 00:16:52.517 END TEST raid_read_error_test 00:16:52.517 ************************************ 00:16:52.517 11:57:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:52.517 11:57:42 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:16:52.517 11:57:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:52.517 11:57:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:52.517 11:57:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:52.776 ************************************ 00:16:52.776 START TEST raid_write_error_test 00:16:52.776 ************************************ 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.XtIi8Lh3u8 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # 
raid_pid=676500 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 676500 /var/tmp/spdk-raid.sock 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 676500 ']' 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:52.776 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:52.776 11:57:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.777 [2024-07-12 11:57:42.834043] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:16:52.777 [2024-07-12 11:57:42.834081] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid676500 ] 00:16:52.777 [2024-07-12 11:57:42.898101] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:52.777 [2024-07-12 11:57:42.967863] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:52.777 [2024-07-12 11:57:43.018145] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:52.777 [2024-07-12 11:57:43.018176] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:53.712 11:57:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:53.712 11:57:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:53.712 11:57:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:53.712 11:57:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:53.712 BaseBdev1_malloc 00:16:53.712 11:57:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:53.712 true 00:16:53.971 11:57:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:53.971 [2024-07-12 11:57:44.106128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:53.971 [2024-07-12 11:57:44.106159] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:53.971 
[2024-07-12 11:57:44.106168] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2c2d0 00:16:53.971 [2024-07-12 11:57:44.106173] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:53.971 [2024-07-12 11:57:44.107277] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:53.971 [2024-07-12 11:57:44.107296] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:53.971 BaseBdev1 00:16:53.971 11:57:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:53.971 11:57:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:54.229 BaseBdev2_malloc 00:16:54.229 11:57:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:54.229 true 00:16:54.230 11:57:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:54.488 [2024-07-12 11:57:44.606687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:54.488 [2024-07-12 11:57:44.606714] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:54.488 [2024-07-12 11:57:44.606723] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe30f40 00:16:54.488 [2024-07-12 11:57:44.606729] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:54.488 [2024-07-12 11:57:44.607648] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:54.488 [2024-07-12 11:57:44.607666] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:54.488 BaseBdev2 00:16:54.488 11:57:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:54.488 11:57:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:54.747 BaseBdev3_malloc 00:16:54.747 11:57:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:54.747 true 00:16:54.747 11:57:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:55.007 [2024-07-12 11:57:45.119451] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:55.007 [2024-07-12 11:57:45.119478] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:55.007 [2024-07-12 11:57:45.119491] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe33ea0 00:16:55.007 [2024-07-12 11:57:45.119497] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:55.007 [2024-07-12 11:57:45.120439] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:55.007 [2024-07-12 11:57:45.120458] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:55.007 BaseBdev3 00:16:55.007 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:55.007 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:16:55.266 BaseBdev4_malloc 00:16:55.266 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:16:55.266 true 00:16:55.266 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:16:55.524 [2024-07-12 11:57:45.611969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:16:55.524 [2024-07-12 11:57:45.611993] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:55.524 [2024-07-12 11:57:45.612002] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2d2f0 00:16:55.524 [2024-07-12 11:57:45.612007] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:55.524 [2024-07-12 11:57:45.612947] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:55.524 [2024-07-12 11:57:45.612965] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:16:55.524 BaseBdev4 00:16:55.524 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:16:55.783 [2024-07-12 11:57:45.772410] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:55.783 [2024-07-12 11:57:45.773234] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:55.783 [2024-07-12 11:57:45.773279] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:55.783 [2024-07-12 11:57:45.773318] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:16:55.783 [2024-07-12 11:57:45.773471] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe2dd30 00:16:55.783 [2024-07-12 11:57:45.773477] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:55.783 [2024-07-12 11:57:45.773596] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd16720 00:16:55.783 [2024-07-12 11:57:45.773693] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe2dd30 00:16:55.783 [2024-07-12 11:57:45.773698] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe2dd30 00:16:55.783 [2024-07-12 11:57:45.773763] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:55.783 "name": "raid_bdev1", 00:16:55.783 "uuid": "28d9ae77-d35b-4cf5-b25c-4f85ac1b1c8a", 00:16:55.783 "strip_size_kb": 64, 00:16:55.783 "state": "online", 00:16:55.783 "raid_level": "concat", 00:16:55.783 "superblock": true, 00:16:55.783 "num_base_bdevs": 4, 00:16:55.783 "num_base_bdevs_discovered": 4, 00:16:55.783 "num_base_bdevs_operational": 4, 00:16:55.783 "base_bdevs_list": [ 00:16:55.783 { 00:16:55.783 "name": "BaseBdev1", 00:16:55.783 "uuid": "b7c8790f-9edb-599a-880f-a6864ce992cb", 00:16:55.783 "is_configured": true, 00:16:55.783 "data_offset": 2048, 00:16:55.783 "data_size": 63488 00:16:55.783 }, 00:16:55.783 { 00:16:55.783 "name": "BaseBdev2", 00:16:55.783 "uuid": "f01d19b9-0ea0-50e0-8a22-59513143f898", 00:16:55.783 "is_configured": true, 00:16:55.783 "data_offset": 2048, 00:16:55.783 "data_size": 63488 00:16:55.783 }, 00:16:55.783 { 00:16:55.783 "name": "BaseBdev3", 00:16:55.783 "uuid": "327157f7-e7d8-51cf-9ae0-2116e20ef4cd", 00:16:55.783 "is_configured": true, 00:16:55.783 "data_offset": 2048, 00:16:55.783 "data_size": 63488 00:16:55.783 }, 00:16:55.783 { 00:16:55.783 "name": "BaseBdev4", 00:16:55.783 "uuid": "d97539aa-e9bd-5d2a-af2d-11754d9e78d8", 00:16:55.783 "is_configured": true, 00:16:55.783 "data_offset": 2048, 00:16:55.783 "data_size": 63488 00:16:55.783 } 00:16:55.783 ] 00:16:55.783 }' 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:55.783 11:57:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:56.352 11:57:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:56.352 11:57:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:56.352 [2024-07-12 11:57:46.514598] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd1a020 00:16:57.327 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.620 "name": "raid_bdev1", 00:16:57.620 "uuid": "28d9ae77-d35b-4cf5-b25c-4f85ac1b1c8a", 00:16:57.620 "strip_size_kb": 64, 00:16:57.620 "state": "online", 00:16:57.620 "raid_level": "concat", 00:16:57.620 "superblock": true, 00:16:57.620 "num_base_bdevs": 4, 00:16:57.620 "num_base_bdevs_discovered": 4, 00:16:57.620 "num_base_bdevs_operational": 4, 00:16:57.620 "base_bdevs_list": [ 00:16:57.620 { 00:16:57.620 "name": "BaseBdev1", 00:16:57.620 "uuid": "b7c8790f-9edb-599a-880f-a6864ce992cb", 00:16:57.620 "is_configured": true, 00:16:57.620 "data_offset": 2048, 00:16:57.620 "data_size": 63488 00:16:57.620 }, 00:16:57.620 { 00:16:57.620 "name": "BaseBdev2", 00:16:57.620 "uuid": "f01d19b9-0ea0-50e0-8a22-59513143f898", 00:16:57.620 "is_configured": true, 00:16:57.620 "data_offset": 2048, 00:16:57.620 "data_size": 63488 00:16:57.620 }, 00:16:57.620 { 00:16:57.620 "name": "BaseBdev3", 00:16:57.620 "uuid": "327157f7-e7d8-51cf-9ae0-2116e20ef4cd", 00:16:57.620 "is_configured": true, 00:16:57.620 "data_offset": 2048, 00:16:57.620 "data_size": 63488 00:16:57.620 }, 00:16:57.620 { 00:16:57.620 "name": "BaseBdev4", 00:16:57.620 "uuid": "d97539aa-e9bd-5d2a-af2d-11754d9e78d8", 00:16:57.620 "is_configured": true, 00:16:57.620 "data_offset": 2048, 00:16:57.620 "data_size": 63488 00:16:57.620 } 00:16:57.620 ] 00:16:57.620 }' 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.620 11:57:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 
00:16:58.189 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:58.448 [2024-07-12 11:57:48.443341] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:58.448 [2024-07-12 11:57:48.443370] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:58.448 [2024-07-12 11:57:48.445418] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:58.448 [2024-07-12 11:57:48.445442] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:58.448 [2024-07-12 11:57:48.445468] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:58.448 [2024-07-12 11:57:48.445473] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe2dd30 name raid_bdev1, state offline 00:16:58.448 0 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 676500 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 676500 ']' 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 676500 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 676500 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process 
with pid 676500' 00:16:58.448 killing process with pid 676500 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 676500 00:16:58.448 [2024-07-12 11:57:48.507901] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:58.448 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 676500 00:16:58.448 [2024-07-12 11:57:48.534503] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.XtIi8Lh3u8 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:16:58.707 00:16:58.707 real 0m5.952s 00:16:58.707 user 0m9.335s 00:16:58.707 sys 0m0.871s 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:58.707 11:57:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.707 ************************************ 00:16:58.707 END TEST raid_write_error_test 00:16:58.707 ************************************ 00:16:58.707 11:57:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:58.707 11:57:48 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:58.707 11:57:48 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test 
raid_state_function_test raid_state_function_test raid1 4 false 00:16:58.707 11:57:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:58.707 11:57:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:58.707 11:57:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:58.707 ************************************ 00:16:58.707 START TEST raid_state_function_test 00:16:58.707 ************************************ 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:58.707 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:58.708 11:57:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=677556 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 677556' 00:16:58.708 Process raid pid: 677556 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 677556 /var/tmp/spdk-raid.sock 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 677556 ']' 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:58.708 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:58.708 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.708 [2024-07-12 11:57:48.855230] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:16:58.708 [2024-07-12 11:57:48.855268] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:58.708 [2024-07-12 11:57:48.920904] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:58.966 [2024-07-12 11:57:49.003539] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:58.966 [2024-07-12 11:57:49.057698] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:58.966 [2024-07-12 11:57:49.057719] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:59.535 11:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:59.535 11:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:59.535 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:59.793 [2024-07-12 11:57:49.804719] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:59.793 [2024-07-12 11:57:49.804748] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:59.793 [2024-07-12 11:57:49.804754] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:59.793 [2024-07-12 11:57:49.804760] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:59.793 [2024-07-12 11:57:49.804764] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:59.793 [2024-07-12 11:57:49.804769] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:59.793 [2024-07-12 11:57:49.804773] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:59.793 [2024-07-12 11:57:49.804778] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:59.793 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.793 "name": "Existed_Raid", 00:16:59.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.793 "strip_size_kb": 0, 00:16:59.793 "state": "configuring", 00:16:59.793 
"raid_level": "raid1", 00:16:59.793 "superblock": false, 00:16:59.793 "num_base_bdevs": 4, 00:16:59.793 "num_base_bdevs_discovered": 0, 00:16:59.793 "num_base_bdevs_operational": 4, 00:16:59.793 "base_bdevs_list": [ 00:16:59.793 { 00:16:59.793 "name": "BaseBdev1", 00:16:59.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.793 "is_configured": false, 00:16:59.793 "data_offset": 0, 00:16:59.793 "data_size": 0 00:16:59.794 }, 00:16:59.794 { 00:16:59.794 "name": "BaseBdev2", 00:16:59.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.794 "is_configured": false, 00:16:59.794 "data_offset": 0, 00:16:59.794 "data_size": 0 00:16:59.794 }, 00:16:59.794 { 00:16:59.794 "name": "BaseBdev3", 00:16:59.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.794 "is_configured": false, 00:16:59.794 "data_offset": 0, 00:16:59.794 "data_size": 0 00:16:59.794 }, 00:16:59.794 { 00:16:59.794 "name": "BaseBdev4", 00:16:59.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.794 "is_configured": false, 00:16:59.794 "data_offset": 0, 00:16:59.794 "data_size": 0 00:16:59.794 } 00:16:59.794 ] 00:16:59.794 }' 00:16:59.794 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.794 11:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.361 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:00.619 [2024-07-12 11:57:50.626764] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:00.619 [2024-07-12 11:57:50.626786] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24801f0 name Existed_Raid, state configuring 00:17:00.619 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:00.619 [2024-07-12 11:57:50.795213] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:00.619 [2024-07-12 11:57:50.795235] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:00.619 [2024-07-12 11:57:50.795240] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:00.619 [2024-07-12 11:57:50.795245] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:00.619 [2024-07-12 11:57:50.795249] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:00.619 [2024-07-12 11:57:50.795254] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:00.619 [2024-07-12 11:57:50.795258] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:00.619 [2024-07-12 11:57:50.795263] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:00.619 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:00.878 [2024-07-12 11:57:50.971810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:00.878 BaseBdev1 00:17:00.878 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:00.878 11:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:00.878 11:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:00.878 11:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:00.878 11:57:50 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:00.878 11:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:00.878 11:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:01.137 [ 00:17:01.137 { 00:17:01.137 "name": "BaseBdev1", 00:17:01.137 "aliases": [ 00:17:01.137 "ed832ffb-82d3-467d-b33d-d2fe11dd2a47" 00:17:01.137 ], 00:17:01.137 "product_name": "Malloc disk", 00:17:01.137 "block_size": 512, 00:17:01.137 "num_blocks": 65536, 00:17:01.137 "uuid": "ed832ffb-82d3-467d-b33d-d2fe11dd2a47", 00:17:01.137 "assigned_rate_limits": { 00:17:01.137 "rw_ios_per_sec": 0, 00:17:01.137 "rw_mbytes_per_sec": 0, 00:17:01.137 "r_mbytes_per_sec": 0, 00:17:01.137 "w_mbytes_per_sec": 0 00:17:01.137 }, 00:17:01.137 "claimed": true, 00:17:01.137 "claim_type": "exclusive_write", 00:17:01.137 "zoned": false, 00:17:01.137 "supported_io_types": { 00:17:01.137 "read": true, 00:17:01.137 "write": true, 00:17:01.137 "unmap": true, 00:17:01.137 "flush": true, 00:17:01.137 "reset": true, 00:17:01.137 "nvme_admin": false, 00:17:01.137 "nvme_io": false, 00:17:01.137 "nvme_io_md": false, 00:17:01.137 "write_zeroes": true, 00:17:01.137 "zcopy": true, 00:17:01.137 "get_zone_info": false, 00:17:01.137 "zone_management": false, 00:17:01.137 "zone_append": false, 00:17:01.137 "compare": false, 00:17:01.137 "compare_and_write": false, 00:17:01.137 "abort": true, 00:17:01.137 "seek_hole": false, 00:17:01.137 "seek_data": false, 00:17:01.137 "copy": true, 00:17:01.137 "nvme_iov_md": false 00:17:01.137 }, 00:17:01.137 "memory_domains": [ 00:17:01.137 { 
00:17:01.137 "dma_device_id": "system", 00:17:01.137 "dma_device_type": 1 00:17:01.137 }, 00:17:01.137 { 00:17:01.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.137 "dma_device_type": 2 00:17:01.137 } 00:17:01.137 ], 00:17:01.137 "driver_specific": {} 00:17:01.137 } 00:17:01.137 ] 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.137 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.396 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:17:01.396 "name": "Existed_Raid", 00:17:01.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.396 "strip_size_kb": 0, 00:17:01.396 "state": "configuring", 00:17:01.396 "raid_level": "raid1", 00:17:01.396 "superblock": false, 00:17:01.396 "num_base_bdevs": 4, 00:17:01.396 "num_base_bdevs_discovered": 1, 00:17:01.396 "num_base_bdevs_operational": 4, 00:17:01.396 "base_bdevs_list": [ 00:17:01.396 { 00:17:01.396 "name": "BaseBdev1", 00:17:01.396 "uuid": "ed832ffb-82d3-467d-b33d-d2fe11dd2a47", 00:17:01.396 "is_configured": true, 00:17:01.396 "data_offset": 0, 00:17:01.396 "data_size": 65536 00:17:01.396 }, 00:17:01.396 { 00:17:01.396 "name": "BaseBdev2", 00:17:01.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.396 "is_configured": false, 00:17:01.396 "data_offset": 0, 00:17:01.396 "data_size": 0 00:17:01.396 }, 00:17:01.396 { 00:17:01.396 "name": "BaseBdev3", 00:17:01.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.396 "is_configured": false, 00:17:01.396 "data_offset": 0, 00:17:01.396 "data_size": 0 00:17:01.396 }, 00:17:01.396 { 00:17:01.396 "name": "BaseBdev4", 00:17:01.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.396 "is_configured": false, 00:17:01.396 "data_offset": 0, 00:17:01.396 "data_size": 0 00:17:01.396 } 00:17:01.396 ] 00:17:01.396 }' 00:17:01.396 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.396 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.965 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:01.965 [2024-07-12 11:57:52.102738] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:01.965 [2024-07-12 11:57:52.102769] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x247fa60 name Existed_Raid, state configuring 
00:17:01.965 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:02.225 [2024-07-12 11:57:52.275203] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:02.225 [2024-07-12 11:57:52.276236] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:02.225 [2024-07-12 11:57:52.276259] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:02.225 [2024-07-12 11:57:52.276264] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:02.225 [2024-07-12 11:57:52.276269] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:02.225 [2024-07-12 11:57:52.276274] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:02.225 [2024-07-12 11:57:52.276279] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.225 "name": "Existed_Raid", 00:17:02.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.225 "strip_size_kb": 0, 00:17:02.225 "state": "configuring", 00:17:02.225 "raid_level": "raid1", 00:17:02.225 "superblock": false, 00:17:02.225 "num_base_bdevs": 4, 00:17:02.225 "num_base_bdevs_discovered": 1, 00:17:02.225 "num_base_bdevs_operational": 4, 00:17:02.225 "base_bdevs_list": [ 00:17:02.225 { 00:17:02.225 "name": "BaseBdev1", 00:17:02.225 "uuid": "ed832ffb-82d3-467d-b33d-d2fe11dd2a47", 00:17:02.225 "is_configured": true, 00:17:02.225 "data_offset": 0, 00:17:02.225 "data_size": 65536 00:17:02.225 }, 00:17:02.225 { 00:17:02.225 "name": "BaseBdev2", 00:17:02.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.225 "is_configured": false, 00:17:02.225 "data_offset": 0, 00:17:02.225 "data_size": 0 00:17:02.225 }, 00:17:02.225 { 00:17:02.225 "name": "BaseBdev3", 00:17:02.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.225 "is_configured": false, 00:17:02.225 
"data_offset": 0, 00:17:02.225 "data_size": 0 00:17:02.225 }, 00:17:02.225 { 00:17:02.225 "name": "BaseBdev4", 00:17:02.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.225 "is_configured": false, 00:17:02.225 "data_offset": 0, 00:17:02.225 "data_size": 0 00:17:02.225 } 00:17:02.225 ] 00:17:02.225 }' 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.225 11:57:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.793 11:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:03.051 [2024-07-12 11:57:53.148246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:03.051 BaseBdev2 00:17:03.051 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:03.051 11:57:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:03.051 11:57:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:03.051 11:57:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:03.051 11:57:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:03.051 11:57:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:03.051 11:57:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:03.310 11:57:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:03.310 [ 
00:17:03.310 { 00:17:03.310 "name": "BaseBdev2", 00:17:03.310 "aliases": [ 00:17:03.310 "3991cb2e-0edb-4e13-9df1-eef24f296756" 00:17:03.310 ], 00:17:03.310 "product_name": "Malloc disk", 00:17:03.310 "block_size": 512, 00:17:03.310 "num_blocks": 65536, 00:17:03.310 "uuid": "3991cb2e-0edb-4e13-9df1-eef24f296756", 00:17:03.310 "assigned_rate_limits": { 00:17:03.310 "rw_ios_per_sec": 0, 00:17:03.310 "rw_mbytes_per_sec": 0, 00:17:03.310 "r_mbytes_per_sec": 0, 00:17:03.310 "w_mbytes_per_sec": 0 00:17:03.310 }, 00:17:03.310 "claimed": true, 00:17:03.310 "claim_type": "exclusive_write", 00:17:03.310 "zoned": false, 00:17:03.310 "supported_io_types": { 00:17:03.310 "read": true, 00:17:03.310 "write": true, 00:17:03.310 "unmap": true, 00:17:03.310 "flush": true, 00:17:03.310 "reset": true, 00:17:03.310 "nvme_admin": false, 00:17:03.310 "nvme_io": false, 00:17:03.310 "nvme_io_md": false, 00:17:03.310 "write_zeroes": true, 00:17:03.310 "zcopy": true, 00:17:03.310 "get_zone_info": false, 00:17:03.310 "zone_management": false, 00:17:03.310 "zone_append": false, 00:17:03.310 "compare": false, 00:17:03.310 "compare_and_write": false, 00:17:03.310 "abort": true, 00:17:03.310 "seek_hole": false, 00:17:03.310 "seek_data": false, 00:17:03.310 "copy": true, 00:17:03.310 "nvme_iov_md": false 00:17:03.310 }, 00:17:03.310 "memory_domains": [ 00:17:03.310 { 00:17:03.310 "dma_device_id": "system", 00:17:03.310 "dma_device_type": 1 00:17:03.310 }, 00:17:03.310 { 00:17:03.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.310 "dma_device_type": 2 00:17:03.310 } 00:17:03.310 ], 00:17:03.310 "driver_specific": {} 00:17:03.310 } 00:17:03.310 ] 00:17:03.310 11:57:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:03.310 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:03.311 11:57:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.311 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:03.569 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.569 "name": "Existed_Raid", 00:17:03.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.569 "strip_size_kb": 0, 00:17:03.569 "state": "configuring", 00:17:03.569 "raid_level": "raid1", 00:17:03.569 "superblock": false, 00:17:03.569 "num_base_bdevs": 4, 00:17:03.569 "num_base_bdevs_discovered": 2, 00:17:03.569 "num_base_bdevs_operational": 4, 00:17:03.569 "base_bdevs_list": [ 00:17:03.569 { 00:17:03.569 
"name": "BaseBdev1", 00:17:03.569 "uuid": "ed832ffb-82d3-467d-b33d-d2fe11dd2a47", 00:17:03.569 "is_configured": true, 00:17:03.569 "data_offset": 0, 00:17:03.569 "data_size": 65536 00:17:03.569 }, 00:17:03.569 { 00:17:03.569 "name": "BaseBdev2", 00:17:03.569 "uuid": "3991cb2e-0edb-4e13-9df1-eef24f296756", 00:17:03.569 "is_configured": true, 00:17:03.569 "data_offset": 0, 00:17:03.569 "data_size": 65536 00:17:03.569 }, 00:17:03.569 { 00:17:03.569 "name": "BaseBdev3", 00:17:03.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.569 "is_configured": false, 00:17:03.569 "data_offset": 0, 00:17:03.569 "data_size": 0 00:17:03.569 }, 00:17:03.569 { 00:17:03.569 "name": "BaseBdev4", 00:17:03.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.569 "is_configured": false, 00:17:03.569 "data_offset": 0, 00:17:03.569 "data_size": 0 00:17:03.569 } 00:17:03.569 ] 00:17:03.569 }' 00:17:03.569 11:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.569 11:57:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.135 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:04.136 [2024-07-12 11:57:54.293965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:04.136 BaseBdev3 00:17:04.136 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:04.136 11:57:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:04.136 11:57:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:04.136 11:57:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:04.136 11:57:54 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:04.136 11:57:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:04.136 11:57:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:04.394 11:57:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:04.653 [ 00:17:04.653 { 00:17:04.653 "name": "BaseBdev3", 00:17:04.653 "aliases": [ 00:17:04.653 "5b06c6e6-b865-4a49-9fd8-2870609e6d94" 00:17:04.653 ], 00:17:04.653 "product_name": "Malloc disk", 00:17:04.653 "block_size": 512, 00:17:04.653 "num_blocks": 65536, 00:17:04.653 "uuid": "5b06c6e6-b865-4a49-9fd8-2870609e6d94", 00:17:04.653 "assigned_rate_limits": { 00:17:04.653 "rw_ios_per_sec": 0, 00:17:04.653 "rw_mbytes_per_sec": 0, 00:17:04.653 "r_mbytes_per_sec": 0, 00:17:04.653 "w_mbytes_per_sec": 0 00:17:04.653 }, 00:17:04.653 "claimed": true, 00:17:04.653 "claim_type": "exclusive_write", 00:17:04.653 "zoned": false, 00:17:04.653 "supported_io_types": { 00:17:04.653 "read": true, 00:17:04.653 "write": true, 00:17:04.653 "unmap": true, 00:17:04.653 "flush": true, 00:17:04.653 "reset": true, 00:17:04.653 "nvme_admin": false, 00:17:04.653 "nvme_io": false, 00:17:04.653 "nvme_io_md": false, 00:17:04.653 "write_zeroes": true, 00:17:04.653 "zcopy": true, 00:17:04.653 "get_zone_info": false, 00:17:04.653 "zone_management": false, 00:17:04.653 "zone_append": false, 00:17:04.653 "compare": false, 00:17:04.653 "compare_and_write": false, 00:17:04.653 "abort": true, 00:17:04.653 "seek_hole": false, 00:17:04.653 "seek_data": false, 00:17:04.653 "copy": true, 00:17:04.653 "nvme_iov_md": false 00:17:04.653 }, 00:17:04.653 "memory_domains": [ 00:17:04.653 { 00:17:04.653 "dma_device_id": "system", 
00:17:04.653 "dma_device_type": 1 00:17:04.653 }, 00:17:04.653 { 00:17:04.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.653 "dma_device_type": 2 00:17:04.653 } 00:17:04.653 ], 00:17:04.653 "driver_specific": {} 00:17:04.653 } 00:17:04.653 ] 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.653 "name": "Existed_Raid", 00:17:04.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.653 "strip_size_kb": 0, 00:17:04.653 "state": "configuring", 00:17:04.653 "raid_level": "raid1", 00:17:04.653 "superblock": false, 00:17:04.653 "num_base_bdevs": 4, 00:17:04.653 "num_base_bdevs_discovered": 3, 00:17:04.653 "num_base_bdevs_operational": 4, 00:17:04.653 "base_bdevs_list": [ 00:17:04.653 { 00:17:04.653 "name": "BaseBdev1", 00:17:04.653 "uuid": "ed832ffb-82d3-467d-b33d-d2fe11dd2a47", 00:17:04.653 "is_configured": true, 00:17:04.653 "data_offset": 0, 00:17:04.653 "data_size": 65536 00:17:04.653 }, 00:17:04.653 { 00:17:04.653 "name": "BaseBdev2", 00:17:04.653 "uuid": "3991cb2e-0edb-4e13-9df1-eef24f296756", 00:17:04.653 "is_configured": true, 00:17:04.653 "data_offset": 0, 00:17:04.653 "data_size": 65536 00:17:04.653 }, 00:17:04.653 { 00:17:04.653 "name": "BaseBdev3", 00:17:04.653 "uuid": "5b06c6e6-b865-4a49-9fd8-2870609e6d94", 00:17:04.653 "is_configured": true, 00:17:04.653 "data_offset": 0, 00:17:04.653 "data_size": 65536 00:17:04.653 }, 00:17:04.653 { 00:17:04.653 "name": "BaseBdev4", 00:17:04.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.653 "is_configured": false, 00:17:04.653 "data_offset": 0, 00:17:04.653 "data_size": 0 00:17:04.653 } 00:17:04.653 ] 00:17:04.653 }' 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.653 11:57:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.220 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:05.479 [2024-07-12 11:57:55.479696] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:05.479 [2024-07-12 11:57:55.479723] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2480b90 00:17:05.479 [2024-07-12 11:57:55.479728] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:05.479 [2024-07-12 11:57:55.479861] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2480700 00:17:05.479 [2024-07-12 11:57:55.479949] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2480b90 00:17:05.479 [2024-07-12 11:57:55.479954] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2480b90 00:17:05.479 [2024-07-12 11:57:55.480066] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:05.479 BaseBdev4 00:17:05.479 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:05.479 11:57:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:05.479 11:57:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:05.479 11:57:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:05.479 11:57:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:05.479 11:57:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:05.479 11:57:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:05.479 11:57:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:05.738 [ 00:17:05.738 { 
00:17:05.738 "name": "BaseBdev4", 00:17:05.738 "aliases": [ 00:17:05.738 "ff3fc9c9-1d42-4542-a286-3678d089560b" 00:17:05.738 ], 00:17:05.738 "product_name": "Malloc disk", 00:17:05.738 "block_size": 512, 00:17:05.738 "num_blocks": 65536, 00:17:05.738 "uuid": "ff3fc9c9-1d42-4542-a286-3678d089560b", 00:17:05.738 "assigned_rate_limits": { 00:17:05.738 "rw_ios_per_sec": 0, 00:17:05.738 "rw_mbytes_per_sec": 0, 00:17:05.738 "r_mbytes_per_sec": 0, 00:17:05.738 "w_mbytes_per_sec": 0 00:17:05.738 }, 00:17:05.738 "claimed": true, 00:17:05.738 "claim_type": "exclusive_write", 00:17:05.738 "zoned": false, 00:17:05.738 "supported_io_types": { 00:17:05.738 "read": true, 00:17:05.738 "write": true, 00:17:05.738 "unmap": true, 00:17:05.738 "flush": true, 00:17:05.738 "reset": true, 00:17:05.738 "nvme_admin": false, 00:17:05.738 "nvme_io": false, 00:17:05.738 "nvme_io_md": false, 00:17:05.738 "write_zeroes": true, 00:17:05.738 "zcopy": true, 00:17:05.738 "get_zone_info": false, 00:17:05.738 "zone_management": false, 00:17:05.738 "zone_append": false, 00:17:05.738 "compare": false, 00:17:05.738 "compare_and_write": false, 00:17:05.738 "abort": true, 00:17:05.738 "seek_hole": false, 00:17:05.738 "seek_data": false, 00:17:05.738 "copy": true, 00:17:05.738 "nvme_iov_md": false 00:17:05.738 }, 00:17:05.738 "memory_domains": [ 00:17:05.738 { 00:17:05.738 "dma_device_id": "system", 00:17:05.738 "dma_device_type": 1 00:17:05.738 }, 00:17:05.738 { 00:17:05.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.738 "dma_device_type": 2 00:17:05.739 } 00:17:05.739 ], 00:17:05.739 "driver_specific": {} 00:17:05.739 } 00:17:05.739 ] 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:05.739 11:57:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.739 11:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:05.998 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.998 "name": "Existed_Raid", 00:17:05.998 "uuid": "c0d8add2-7ffc-4f15-9c99-20c4dba12505", 00:17:05.998 "strip_size_kb": 0, 00:17:05.998 "state": "online", 00:17:05.998 "raid_level": "raid1", 00:17:05.998 "superblock": false, 00:17:05.998 "num_base_bdevs": 4, 00:17:05.998 "num_base_bdevs_discovered": 4, 00:17:05.998 "num_base_bdevs_operational": 4, 00:17:05.998 "base_bdevs_list": [ 00:17:05.998 { 00:17:05.998 "name": 
"BaseBdev1", 00:17:05.998 "uuid": "ed832ffb-82d3-467d-b33d-d2fe11dd2a47", 00:17:05.998 "is_configured": true, 00:17:05.998 "data_offset": 0, 00:17:05.998 "data_size": 65536 00:17:05.998 }, 00:17:05.998 { 00:17:05.998 "name": "BaseBdev2", 00:17:05.998 "uuid": "3991cb2e-0edb-4e13-9df1-eef24f296756", 00:17:05.998 "is_configured": true, 00:17:05.998 "data_offset": 0, 00:17:05.998 "data_size": 65536 00:17:05.998 }, 00:17:05.998 { 00:17:05.998 "name": "BaseBdev3", 00:17:05.998 "uuid": "5b06c6e6-b865-4a49-9fd8-2870609e6d94", 00:17:05.998 "is_configured": true, 00:17:05.998 "data_offset": 0, 00:17:05.998 "data_size": 65536 00:17:05.998 }, 00:17:05.998 { 00:17:05.998 "name": "BaseBdev4", 00:17:05.998 "uuid": "ff3fc9c9-1d42-4542-a286-3678d089560b", 00:17:05.998 "is_configured": true, 00:17:05.998 "data_offset": 0, 00:17:05.998 "data_size": 65536 00:17:05.998 } 00:17:05.998 ] 00:17:05.998 }' 00:17:05.998 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.998 11:57:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.566 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:06.566 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:06.566 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:06.566 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:06.566 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:06.566 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:06.566 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:06.566 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:06.566 [2024-07-12 11:57:56.658951] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:06.566 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:06.566 "name": "Existed_Raid", 00:17:06.566 "aliases": [ 00:17:06.566 "c0d8add2-7ffc-4f15-9c99-20c4dba12505" 00:17:06.566 ], 00:17:06.566 "product_name": "Raid Volume", 00:17:06.566 "block_size": 512, 00:17:06.566 "num_blocks": 65536, 00:17:06.566 "uuid": "c0d8add2-7ffc-4f15-9c99-20c4dba12505", 00:17:06.566 "assigned_rate_limits": { 00:17:06.566 "rw_ios_per_sec": 0, 00:17:06.566 "rw_mbytes_per_sec": 0, 00:17:06.566 "r_mbytes_per_sec": 0, 00:17:06.566 "w_mbytes_per_sec": 0 00:17:06.566 }, 00:17:06.566 "claimed": false, 00:17:06.566 "zoned": false, 00:17:06.566 "supported_io_types": { 00:17:06.566 "read": true, 00:17:06.566 "write": true, 00:17:06.566 "unmap": false, 00:17:06.566 "flush": false, 00:17:06.566 "reset": true, 00:17:06.566 "nvme_admin": false, 00:17:06.566 "nvme_io": false, 00:17:06.566 "nvme_io_md": false, 00:17:06.566 "write_zeroes": true, 00:17:06.566 "zcopy": false, 00:17:06.566 "get_zone_info": false, 00:17:06.566 "zone_management": false, 00:17:06.566 "zone_append": false, 00:17:06.566 "compare": false, 00:17:06.566 "compare_and_write": false, 00:17:06.566 "abort": false, 00:17:06.566 "seek_hole": false, 00:17:06.566 "seek_data": false, 00:17:06.566 "copy": false, 00:17:06.566 "nvme_iov_md": false 00:17:06.566 }, 00:17:06.566 "memory_domains": [ 00:17:06.566 { 00:17:06.566 "dma_device_id": "system", 00:17:06.566 "dma_device_type": 1 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.566 "dma_device_type": 2 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 "dma_device_id": "system", 00:17:06.566 "dma_device_type": 1 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.566 "dma_device_type": 2 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 "dma_device_id": "system", 00:17:06.566 "dma_device_type": 1 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.566 "dma_device_type": 2 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 "dma_device_id": "system", 00:17:06.566 "dma_device_type": 1 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.566 "dma_device_type": 2 00:17:06.566 } 00:17:06.566 ], 00:17:06.566 "driver_specific": { 00:17:06.566 "raid": { 00:17:06.566 "uuid": "c0d8add2-7ffc-4f15-9c99-20c4dba12505", 00:17:06.566 "strip_size_kb": 0, 00:17:06.566 "state": "online", 00:17:06.566 "raid_level": "raid1", 00:17:06.566 "superblock": false, 00:17:06.566 "num_base_bdevs": 4, 00:17:06.566 "num_base_bdevs_discovered": 4, 00:17:06.566 "num_base_bdevs_operational": 4, 00:17:06.566 "base_bdevs_list": [ 00:17:06.566 { 00:17:06.566 "name": "BaseBdev1", 00:17:06.566 "uuid": "ed832ffb-82d3-467d-b33d-d2fe11dd2a47", 00:17:06.566 "is_configured": true, 00:17:06.566 "data_offset": 0, 00:17:06.566 "data_size": 65536 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 "name": "BaseBdev2", 00:17:06.566 "uuid": "3991cb2e-0edb-4e13-9df1-eef24f296756", 00:17:06.566 "is_configured": true, 00:17:06.566 "data_offset": 0, 00:17:06.566 "data_size": 65536 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 "name": "BaseBdev3", 00:17:06.566 "uuid": "5b06c6e6-b865-4a49-9fd8-2870609e6d94", 00:17:06.566 "is_configured": true, 00:17:06.566 "data_offset": 0, 00:17:06.566 "data_size": 65536 00:17:06.566 }, 00:17:06.566 { 00:17:06.566 "name": "BaseBdev4", 00:17:06.566 "uuid": "ff3fc9c9-1d42-4542-a286-3678d089560b", 00:17:06.566 "is_configured": true, 00:17:06.566 "data_offset": 0, 00:17:06.567 "data_size": 65536 00:17:06.567 } 00:17:06.567 ] 00:17:06.567 } 00:17:06.567 } 00:17:06.567 }' 00:17:06.567 11:57:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:06.567 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:06.567 BaseBdev2 00:17:06.567 BaseBdev3 00:17:06.567 BaseBdev4' 00:17:06.567 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:06.567 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:06.567 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.825 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.825 "name": "BaseBdev1", 00:17:06.825 "aliases": [ 00:17:06.825 "ed832ffb-82d3-467d-b33d-d2fe11dd2a47" 00:17:06.825 ], 00:17:06.825 "product_name": "Malloc disk", 00:17:06.825 "block_size": 512, 00:17:06.825 "num_blocks": 65536, 00:17:06.825 "uuid": "ed832ffb-82d3-467d-b33d-d2fe11dd2a47", 00:17:06.825 "assigned_rate_limits": { 00:17:06.825 "rw_ios_per_sec": 0, 00:17:06.825 "rw_mbytes_per_sec": 0, 00:17:06.825 "r_mbytes_per_sec": 0, 00:17:06.825 "w_mbytes_per_sec": 0 00:17:06.825 }, 00:17:06.825 "claimed": true, 00:17:06.825 "claim_type": "exclusive_write", 00:17:06.825 "zoned": false, 00:17:06.825 "supported_io_types": { 00:17:06.825 "read": true, 00:17:06.825 "write": true, 00:17:06.825 "unmap": true, 00:17:06.825 "flush": true, 00:17:06.825 "reset": true, 00:17:06.825 "nvme_admin": false, 00:17:06.825 "nvme_io": false, 00:17:06.826 "nvme_io_md": false, 00:17:06.826 "write_zeroes": true, 00:17:06.826 "zcopy": true, 00:17:06.826 "get_zone_info": false, 00:17:06.826 "zone_management": false, 00:17:06.826 "zone_append": false, 00:17:06.826 "compare": false, 00:17:06.826 "compare_and_write": false, 00:17:06.826 "abort": true, 00:17:06.826 "seek_hole": 
false, 00:17:06.826 "seek_data": false, 00:17:06.826 "copy": true, 00:17:06.826 "nvme_iov_md": false 00:17:06.826 }, 00:17:06.826 "memory_domains": [ 00:17:06.826 { 00:17:06.826 "dma_device_id": "system", 00:17:06.826 "dma_device_type": 1 00:17:06.826 }, 00:17:06.826 { 00:17:06.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.826 "dma_device_type": 2 00:17:06.826 } 00:17:06.826 ], 00:17:06.826 "driver_specific": {} 00:17:06.826 }' 00:17:06.826 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.826 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.826 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.826 11:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.826 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.826 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.826 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.084 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.084 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.084 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.084 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.084 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.084 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.084 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 
00:17:07.084 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.343 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.343 "name": "BaseBdev2", 00:17:07.343 "aliases": [ 00:17:07.343 "3991cb2e-0edb-4e13-9df1-eef24f296756" 00:17:07.343 ], 00:17:07.343 "product_name": "Malloc disk", 00:17:07.343 "block_size": 512, 00:17:07.343 "num_blocks": 65536, 00:17:07.343 "uuid": "3991cb2e-0edb-4e13-9df1-eef24f296756", 00:17:07.343 "assigned_rate_limits": { 00:17:07.343 "rw_ios_per_sec": 0, 00:17:07.343 "rw_mbytes_per_sec": 0, 00:17:07.343 "r_mbytes_per_sec": 0, 00:17:07.343 "w_mbytes_per_sec": 0 00:17:07.343 }, 00:17:07.343 "claimed": true, 00:17:07.343 "claim_type": "exclusive_write", 00:17:07.343 "zoned": false, 00:17:07.343 "supported_io_types": { 00:17:07.343 "read": true, 00:17:07.343 "write": true, 00:17:07.343 "unmap": true, 00:17:07.343 "flush": true, 00:17:07.343 "reset": true, 00:17:07.343 "nvme_admin": false, 00:17:07.343 "nvme_io": false, 00:17:07.343 "nvme_io_md": false, 00:17:07.343 "write_zeroes": true, 00:17:07.343 "zcopy": true, 00:17:07.343 "get_zone_info": false, 00:17:07.343 "zone_management": false, 00:17:07.343 "zone_append": false, 00:17:07.343 "compare": false, 00:17:07.343 "compare_and_write": false, 00:17:07.343 "abort": true, 00:17:07.343 "seek_hole": false, 00:17:07.343 "seek_data": false, 00:17:07.343 "copy": true, 00:17:07.343 "nvme_iov_md": false 00:17:07.343 }, 00:17:07.343 "memory_domains": [ 00:17:07.343 { 00:17:07.343 "dma_device_id": "system", 00:17:07.343 "dma_device_type": 1 00:17:07.343 }, 00:17:07.343 { 00:17:07.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.343 "dma_device_type": 2 00:17:07.343 } 00:17:07.343 ], 00:17:07.343 "driver_specific": {} 00:17:07.343 }' 00:17:07.343 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.343 11:57:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.343 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.343 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.343 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.343 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:07.343 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.343 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.603 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.603 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.603 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.603 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.603 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.603 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:07.603 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.603 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.603 "name": "BaseBdev3", 00:17:07.603 "aliases": [ 00:17:07.603 "5b06c6e6-b865-4a49-9fd8-2870609e6d94" 00:17:07.603 ], 00:17:07.603 "product_name": "Malloc disk", 00:17:07.603 "block_size": 512, 00:17:07.603 "num_blocks": 65536, 00:17:07.603 "uuid": "5b06c6e6-b865-4a49-9fd8-2870609e6d94", 00:17:07.603 "assigned_rate_limits": { 00:17:07.603 "rw_ios_per_sec": 0, 00:17:07.603 "rw_mbytes_per_sec": 
0, 00:17:07.603 "r_mbytes_per_sec": 0, 00:17:07.603 "w_mbytes_per_sec": 0 00:17:07.603 }, 00:17:07.603 "claimed": true, 00:17:07.603 "claim_type": "exclusive_write", 00:17:07.603 "zoned": false, 00:17:07.603 "supported_io_types": { 00:17:07.603 "read": true, 00:17:07.603 "write": true, 00:17:07.603 "unmap": true, 00:17:07.603 "flush": true, 00:17:07.603 "reset": true, 00:17:07.603 "nvme_admin": false, 00:17:07.603 "nvme_io": false, 00:17:07.603 "nvme_io_md": false, 00:17:07.603 "write_zeroes": true, 00:17:07.603 "zcopy": true, 00:17:07.603 "get_zone_info": false, 00:17:07.603 "zone_management": false, 00:17:07.603 "zone_append": false, 00:17:07.603 "compare": false, 00:17:07.603 "compare_and_write": false, 00:17:07.603 "abort": true, 00:17:07.603 "seek_hole": false, 00:17:07.603 "seek_data": false, 00:17:07.603 "copy": true, 00:17:07.603 "nvme_iov_md": false 00:17:07.603 }, 00:17:07.603 "memory_domains": [ 00:17:07.603 { 00:17:07.603 "dma_device_id": "system", 00:17:07.603 "dma_device_type": 1 00:17:07.603 }, 00:17:07.603 { 00:17:07.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.603 "dma_device_type": 2 00:17:07.603 } 00:17:07.603 ], 00:17:07.603 "driver_specific": {} 00:17:07.603 }' 00:17:07.603 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.861 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.861 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.861 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.861 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.861 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:07.861 11:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.861 11:57:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.861 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.861 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.861 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.120 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.120 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:08.120 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:08.120 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:08.120 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:08.120 "name": "BaseBdev4", 00:17:08.120 "aliases": [ 00:17:08.120 "ff3fc9c9-1d42-4542-a286-3678d089560b" 00:17:08.120 ], 00:17:08.120 "product_name": "Malloc disk", 00:17:08.120 "block_size": 512, 00:17:08.120 "num_blocks": 65536, 00:17:08.120 "uuid": "ff3fc9c9-1d42-4542-a286-3678d089560b", 00:17:08.120 "assigned_rate_limits": { 00:17:08.120 "rw_ios_per_sec": 0, 00:17:08.120 "rw_mbytes_per_sec": 0, 00:17:08.120 "r_mbytes_per_sec": 0, 00:17:08.120 "w_mbytes_per_sec": 0 00:17:08.120 }, 00:17:08.120 "claimed": true, 00:17:08.120 "claim_type": "exclusive_write", 00:17:08.120 "zoned": false, 00:17:08.120 "supported_io_types": { 00:17:08.120 "read": true, 00:17:08.120 "write": true, 00:17:08.120 "unmap": true, 00:17:08.120 "flush": true, 00:17:08.120 "reset": true, 00:17:08.120 "nvme_admin": false, 00:17:08.120 "nvme_io": false, 00:17:08.120 "nvme_io_md": false, 00:17:08.120 "write_zeroes": true, 00:17:08.120 "zcopy": true, 00:17:08.120 "get_zone_info": false, 00:17:08.120 "zone_management": false, 
00:17:08.120 "zone_append": false, 00:17:08.120 "compare": false, 00:17:08.120 "compare_and_write": false, 00:17:08.120 "abort": true, 00:17:08.120 "seek_hole": false, 00:17:08.120 "seek_data": false, 00:17:08.120 "copy": true, 00:17:08.120 "nvme_iov_md": false 00:17:08.120 }, 00:17:08.120 "memory_domains": [ 00:17:08.120 { 00:17:08.120 "dma_device_id": "system", 00:17:08.120 "dma_device_type": 1 00:17:08.120 }, 00:17:08.120 { 00:17:08.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.120 "dma_device_type": 2 00:17:08.120 } 00:17:08.120 ], 00:17:08.120 "driver_specific": {} 00:17:08.120 }' 00:17:08.120 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.120 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.378 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:08.637 [2024-07-12 11:57:58.748194] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.637 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.896 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.896 "name": "Existed_Raid", 00:17:08.896 "uuid": "c0d8add2-7ffc-4f15-9c99-20c4dba12505", 00:17:08.896 "strip_size_kb": 0, 00:17:08.896 "state": "online", 00:17:08.896 "raid_level": "raid1", 00:17:08.896 "superblock": false, 00:17:08.896 "num_base_bdevs": 4, 00:17:08.896 "num_base_bdevs_discovered": 3, 00:17:08.896 "num_base_bdevs_operational": 3, 00:17:08.896 "base_bdevs_list": [ 00:17:08.896 { 00:17:08.896 "name": null, 00:17:08.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.896 "is_configured": false, 00:17:08.896 "data_offset": 0, 00:17:08.896 "data_size": 65536 00:17:08.896 }, 00:17:08.896 { 00:17:08.896 "name": "BaseBdev2", 00:17:08.896 "uuid": "3991cb2e-0edb-4e13-9df1-eef24f296756", 00:17:08.896 "is_configured": true, 00:17:08.896 "data_offset": 0, 00:17:08.896 "data_size": 65536 00:17:08.896 }, 00:17:08.896 { 00:17:08.896 "name": "BaseBdev3", 00:17:08.896 "uuid": "5b06c6e6-b865-4a49-9fd8-2870609e6d94", 00:17:08.896 "is_configured": true, 00:17:08.896 "data_offset": 0, 00:17:08.896 "data_size": 65536 00:17:08.896 }, 00:17:08.896 { 00:17:08.896 "name": "BaseBdev4", 00:17:08.896 "uuid": "ff3fc9c9-1d42-4542-a286-3678d089560b", 00:17:08.896 "is_configured": true, 00:17:08.896 "data_offset": 0, 00:17:08.896 "data_size": 65536 00:17:08.896 } 00:17:08.896 ] 00:17:08.896 }' 00:17:08.896 11:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.896 11:57:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.463 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:09.463 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:17:09.463 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.463 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:09.463 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:09.463 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:09.463 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:09.723 [2024-07-12 11:57:59.723651] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:09.723 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:09.723 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:09.723 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.723 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:09.723 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:09.723 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:09.723 11:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:09.982 [2024-07-12 11:58:00.082213] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:09.982 11:58:00 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:09.982 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:09.982 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.982 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:10.240 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:10.240 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:10.241 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:10.241 [2024-07-12 11:58:00.428662] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:10.241 [2024-07-12 11:58:00.428717] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:10.241 [2024-07-12 11:58:00.438740] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:10.241 [2024-07-12 11:58:00.438764] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:10.241 [2024-07-12 11:58:00.438769] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2480b90 name Existed_Raid, state offline 00:17:10.241 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:10.241 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:10.241 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.241 11:58:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:10.500 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:10.500 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:10.500 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:10.500 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:10.500 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:10.500 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:10.759 BaseBdev2 00:17:10.759 11:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:10.759 11:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:10.759 11:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:10.759 11:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:10.759 11:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:10.759 11:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:10.759 11:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.759 11:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:11.019 [ 00:17:11.019 { 00:17:11.019 "name": 
"BaseBdev2", 00:17:11.020 "aliases": [ 00:17:11.020 "4c64537f-23b4-4ed7-94c6-ac8d22953c6f" 00:17:11.020 ], 00:17:11.020 "product_name": "Malloc disk", 00:17:11.020 "block_size": 512, 00:17:11.020 "num_blocks": 65536, 00:17:11.020 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:11.020 "assigned_rate_limits": { 00:17:11.020 "rw_ios_per_sec": 0, 00:17:11.020 "rw_mbytes_per_sec": 0, 00:17:11.020 "r_mbytes_per_sec": 0, 00:17:11.020 "w_mbytes_per_sec": 0 00:17:11.020 }, 00:17:11.020 "claimed": false, 00:17:11.020 "zoned": false, 00:17:11.020 "supported_io_types": { 00:17:11.020 "read": true, 00:17:11.020 "write": true, 00:17:11.020 "unmap": true, 00:17:11.020 "flush": true, 00:17:11.020 "reset": true, 00:17:11.020 "nvme_admin": false, 00:17:11.020 "nvme_io": false, 00:17:11.020 "nvme_io_md": false, 00:17:11.020 "write_zeroes": true, 00:17:11.020 "zcopy": true, 00:17:11.020 "get_zone_info": false, 00:17:11.020 "zone_management": false, 00:17:11.020 "zone_append": false, 00:17:11.020 "compare": false, 00:17:11.020 "compare_and_write": false, 00:17:11.020 "abort": true, 00:17:11.020 "seek_hole": false, 00:17:11.020 "seek_data": false, 00:17:11.020 "copy": true, 00:17:11.020 "nvme_iov_md": false 00:17:11.020 }, 00:17:11.020 "memory_domains": [ 00:17:11.020 { 00:17:11.020 "dma_device_id": "system", 00:17:11.020 "dma_device_type": 1 00:17:11.020 }, 00:17:11.020 { 00:17:11.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.020 "dma_device_type": 2 00:17:11.020 } 00:17:11.020 ], 00:17:11.020 "driver_specific": {} 00:17:11.020 } 00:17:11.020 ] 00:17:11.020 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:11.020 11:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:11.020 11:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:11.020 11:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:11.020 BaseBdev3 00:17:11.278 11:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:11.278 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:11.278 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:11.278 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:11.278 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:11.278 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:11.278 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:11.278 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:11.535 [ 00:17:11.535 { 00:17:11.535 "name": "BaseBdev3", 00:17:11.535 "aliases": [ 00:17:11.535 "1af89eaf-0825-41b6-a855-9913160a5e58" 00:17:11.535 ], 00:17:11.535 "product_name": "Malloc disk", 00:17:11.535 "block_size": 512, 00:17:11.535 "num_blocks": 65536, 00:17:11.535 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:11.535 "assigned_rate_limits": { 00:17:11.535 "rw_ios_per_sec": 0, 00:17:11.535 "rw_mbytes_per_sec": 0, 00:17:11.535 "r_mbytes_per_sec": 0, 00:17:11.535 "w_mbytes_per_sec": 0 00:17:11.535 }, 00:17:11.535 "claimed": false, 00:17:11.535 "zoned": false, 00:17:11.535 "supported_io_types": { 00:17:11.535 "read": true, 00:17:11.536 "write": true, 00:17:11.536 "unmap": true, 00:17:11.536 "flush": true, 00:17:11.536 
"reset": true, 00:17:11.536 "nvme_admin": false, 00:17:11.536 "nvme_io": false, 00:17:11.536 "nvme_io_md": false, 00:17:11.536 "write_zeroes": true, 00:17:11.536 "zcopy": true, 00:17:11.536 "get_zone_info": false, 00:17:11.536 "zone_management": false, 00:17:11.536 "zone_append": false, 00:17:11.536 "compare": false, 00:17:11.536 "compare_and_write": false, 00:17:11.536 "abort": true, 00:17:11.536 "seek_hole": false, 00:17:11.536 "seek_data": false, 00:17:11.536 "copy": true, 00:17:11.536 "nvme_iov_md": false 00:17:11.536 }, 00:17:11.536 "memory_domains": [ 00:17:11.536 { 00:17:11.536 "dma_device_id": "system", 00:17:11.536 "dma_device_type": 1 00:17:11.536 }, 00:17:11.536 { 00:17:11.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.536 "dma_device_type": 2 00:17:11.536 } 00:17:11.536 ], 00:17:11.536 "driver_specific": {} 00:17:11.536 } 00:17:11.536 ] 00:17:11.536 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:11.536 11:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:11.536 11:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:11.536 11:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:11.536 BaseBdev4 00:17:11.536 11:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:11.536 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:11.536 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:11.794 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:11.794 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:11.794 11:58:01 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:11.794 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:11.794 11:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:12.052 [ 00:17:12.052 { 00:17:12.052 "name": "BaseBdev4", 00:17:12.052 "aliases": [ 00:17:12.052 "dc5014d8-3908-4b0d-9900-dea7e901a722" 00:17:12.052 ], 00:17:12.052 "product_name": "Malloc disk", 00:17:12.052 "block_size": 512, 00:17:12.052 "num_blocks": 65536, 00:17:12.052 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:12.052 "assigned_rate_limits": { 00:17:12.052 "rw_ios_per_sec": 0, 00:17:12.052 "rw_mbytes_per_sec": 0, 00:17:12.052 "r_mbytes_per_sec": 0, 00:17:12.052 "w_mbytes_per_sec": 0 00:17:12.052 }, 00:17:12.052 "claimed": false, 00:17:12.052 "zoned": false, 00:17:12.052 "supported_io_types": { 00:17:12.052 "read": true, 00:17:12.052 "write": true, 00:17:12.052 "unmap": true, 00:17:12.052 "flush": true, 00:17:12.052 "reset": true, 00:17:12.052 "nvme_admin": false, 00:17:12.052 "nvme_io": false, 00:17:12.052 "nvme_io_md": false, 00:17:12.052 "write_zeroes": true, 00:17:12.052 "zcopy": true, 00:17:12.052 "get_zone_info": false, 00:17:12.052 "zone_management": false, 00:17:12.052 "zone_append": false, 00:17:12.052 "compare": false, 00:17:12.052 "compare_and_write": false, 00:17:12.052 "abort": true, 00:17:12.052 "seek_hole": false, 00:17:12.052 "seek_data": false, 00:17:12.052 "copy": true, 00:17:12.052 "nvme_iov_md": false 00:17:12.052 }, 00:17:12.052 "memory_domains": [ 00:17:12.052 { 00:17:12.052 "dma_device_id": "system", 00:17:12.052 "dma_device_type": 1 00:17:12.052 }, 00:17:12.052 { 00:17:12.052 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:17:12.052 "dma_device_type": 2 00:17:12.052 } 00:17:12.052 ], 00:17:12.052 "driver_specific": {} 00:17:12.052 } 00:17:12.052 ] 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:12.052 [2024-07-12 11:58:02.262311] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:12.052 [2024-07-12 11:58:02.262339] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:12.052 [2024-07-12 11:58:02.262350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:12.052 [2024-07-12 11:58:02.263312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:12.052 [2024-07-12 11:58:02.263342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:12.052 11:58:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.052 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.310 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.310 "name": "Existed_Raid", 00:17:12.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.310 "strip_size_kb": 0, 00:17:12.310 "state": "configuring", 00:17:12.310 "raid_level": "raid1", 00:17:12.310 "superblock": false, 00:17:12.310 "num_base_bdevs": 4, 00:17:12.310 "num_base_bdevs_discovered": 3, 00:17:12.310 "num_base_bdevs_operational": 4, 00:17:12.310 "base_bdevs_list": [ 00:17:12.310 { 00:17:12.310 "name": "BaseBdev1", 00:17:12.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.310 "is_configured": false, 00:17:12.310 "data_offset": 0, 00:17:12.310 "data_size": 0 00:17:12.310 }, 00:17:12.310 { 00:17:12.310 "name": "BaseBdev2", 00:17:12.310 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:12.310 "is_configured": true, 00:17:12.310 "data_offset": 0, 00:17:12.310 "data_size": 65536 00:17:12.310 }, 00:17:12.310 { 00:17:12.310 "name": "BaseBdev3", 00:17:12.310 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:12.310 "is_configured": true, 00:17:12.310 "data_offset": 0, 
00:17:12.310 "data_size": 65536 00:17:12.310 }, 00:17:12.310 { 00:17:12.310 "name": "BaseBdev4", 00:17:12.310 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:12.310 "is_configured": true, 00:17:12.310 "data_offset": 0, 00:17:12.310 "data_size": 65536 00:17:12.310 } 00:17:12.310 ] 00:17:12.310 }' 00:17:12.310 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.310 11:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.876 11:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:12.876 [2024-07-12 11:58:03.072409] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.876 
11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.876 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:13.133 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.133 "name": "Existed_Raid", 00:17:13.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:13.133 "strip_size_kb": 0, 00:17:13.133 "state": "configuring", 00:17:13.133 "raid_level": "raid1", 00:17:13.133 "superblock": false, 00:17:13.133 "num_base_bdevs": 4, 00:17:13.133 "num_base_bdevs_discovered": 2, 00:17:13.133 "num_base_bdevs_operational": 4, 00:17:13.133 "base_bdevs_list": [ 00:17:13.133 { 00:17:13.133 "name": "BaseBdev1", 00:17:13.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:13.133 "is_configured": false, 00:17:13.133 "data_offset": 0, 00:17:13.133 "data_size": 0 00:17:13.133 }, 00:17:13.133 { 00:17:13.133 "name": null, 00:17:13.133 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:13.133 "is_configured": false, 00:17:13.133 "data_offset": 0, 00:17:13.133 "data_size": 65536 00:17:13.133 }, 00:17:13.133 { 00:17:13.133 "name": "BaseBdev3", 00:17:13.133 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:13.133 "is_configured": true, 00:17:13.133 "data_offset": 0, 00:17:13.133 "data_size": 65536 00:17:13.133 }, 00:17:13.133 { 00:17:13.133 "name": "BaseBdev4", 00:17:13.133 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:13.133 "is_configured": true, 00:17:13.133 "data_offset": 0, 00:17:13.133 "data_size": 65536 00:17:13.133 } 00:17:13.133 ] 00:17:13.133 }' 00:17:13.133 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.133 11:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.695 11:58:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.695 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:13.695 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:13.695 11:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:13.952 [2024-07-12 11:58:04.065911] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:13.952 BaseBdev1 00:17:13.952 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:13.952 11:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:13.952 11:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:13.952 11:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:13.952 11:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:13.952 11:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:13.952 11:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:14.210 [ 00:17:14.210 { 00:17:14.210 "name": "BaseBdev1", 00:17:14.210 "aliases": [ 00:17:14.210 
"0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df" 00:17:14.210 ], 00:17:14.210 "product_name": "Malloc disk", 00:17:14.210 "block_size": 512, 00:17:14.210 "num_blocks": 65536, 00:17:14.210 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:14.210 "assigned_rate_limits": { 00:17:14.210 "rw_ios_per_sec": 0, 00:17:14.210 "rw_mbytes_per_sec": 0, 00:17:14.210 "r_mbytes_per_sec": 0, 00:17:14.210 "w_mbytes_per_sec": 0 00:17:14.210 }, 00:17:14.210 "claimed": true, 00:17:14.210 "claim_type": "exclusive_write", 00:17:14.210 "zoned": false, 00:17:14.210 "supported_io_types": { 00:17:14.210 "read": true, 00:17:14.210 "write": true, 00:17:14.210 "unmap": true, 00:17:14.210 "flush": true, 00:17:14.210 "reset": true, 00:17:14.210 "nvme_admin": false, 00:17:14.210 "nvme_io": false, 00:17:14.210 "nvme_io_md": false, 00:17:14.210 "write_zeroes": true, 00:17:14.210 "zcopy": true, 00:17:14.210 "get_zone_info": false, 00:17:14.210 "zone_management": false, 00:17:14.210 "zone_append": false, 00:17:14.210 "compare": false, 00:17:14.210 "compare_and_write": false, 00:17:14.210 "abort": true, 00:17:14.210 "seek_hole": false, 00:17:14.210 "seek_data": false, 00:17:14.210 "copy": true, 00:17:14.210 "nvme_iov_md": false 00:17:14.210 }, 00:17:14.210 "memory_domains": [ 00:17:14.210 { 00:17:14.210 "dma_device_id": "system", 00:17:14.210 "dma_device_type": 1 00:17:14.210 }, 00:17:14.210 { 00:17:14.210 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.210 "dma_device_type": 2 00:17:14.210 } 00:17:14.210 ], 00:17:14.210 "driver_specific": {} 00:17:14.210 } 00:17:14.210 ] 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.210 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:14.469 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.469 "name": "Existed_Raid", 00:17:14.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:14.469 "strip_size_kb": 0, 00:17:14.469 "state": "configuring", 00:17:14.469 "raid_level": "raid1", 00:17:14.469 "superblock": false, 00:17:14.469 "num_base_bdevs": 4, 00:17:14.469 "num_base_bdevs_discovered": 3, 00:17:14.469 "num_base_bdevs_operational": 4, 00:17:14.469 "base_bdevs_list": [ 00:17:14.469 { 00:17:14.469 "name": "BaseBdev1", 00:17:14.469 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:14.469 "is_configured": true, 00:17:14.469 "data_offset": 0, 00:17:14.469 "data_size": 65536 00:17:14.469 }, 00:17:14.469 { 00:17:14.469 "name": null, 00:17:14.469 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 
00:17:14.469 "is_configured": false, 00:17:14.469 "data_offset": 0, 00:17:14.469 "data_size": 65536 00:17:14.469 }, 00:17:14.469 { 00:17:14.469 "name": "BaseBdev3", 00:17:14.469 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:14.469 "is_configured": true, 00:17:14.469 "data_offset": 0, 00:17:14.469 "data_size": 65536 00:17:14.469 }, 00:17:14.469 { 00:17:14.469 "name": "BaseBdev4", 00:17:14.469 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:14.469 "is_configured": true, 00:17:14.469 "data_offset": 0, 00:17:14.469 "data_size": 65536 00:17:14.469 } 00:17:14.469 ] 00:17:14.469 }' 00:17:14.469 11:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.469 11:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.036 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:15.036 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.036 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:15.036 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:15.295 [2024-07-12 11:58:05.405386] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:15.295 11:58:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.295 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.554 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.554 "name": "Existed_Raid", 00:17:15.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.554 "strip_size_kb": 0, 00:17:15.554 "state": "configuring", 00:17:15.554 "raid_level": "raid1", 00:17:15.554 "superblock": false, 00:17:15.554 "num_base_bdevs": 4, 00:17:15.554 "num_base_bdevs_discovered": 2, 00:17:15.554 "num_base_bdevs_operational": 4, 00:17:15.554 "base_bdevs_list": [ 00:17:15.554 { 00:17:15.554 "name": "BaseBdev1", 00:17:15.554 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:15.554 "is_configured": true, 00:17:15.554 "data_offset": 0, 00:17:15.554 "data_size": 65536 00:17:15.554 }, 00:17:15.554 { 00:17:15.554 "name": null, 00:17:15.554 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:15.554 "is_configured": false, 00:17:15.554 "data_offset": 0, 00:17:15.554 
"data_size": 65536 00:17:15.554 }, 00:17:15.554 { 00:17:15.555 "name": null, 00:17:15.555 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:15.555 "is_configured": false, 00:17:15.555 "data_offset": 0, 00:17:15.555 "data_size": 65536 00:17:15.555 }, 00:17:15.555 { 00:17:15.555 "name": "BaseBdev4", 00:17:15.555 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:15.555 "is_configured": true, 00:17:15.555 "data_offset": 0, 00:17:15.555 "data_size": 65536 00:17:15.555 } 00:17:15.555 ] 00:17:15.555 }' 00:17:15.555 11:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.555 11:58:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.122 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:16.122 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.122 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:16.122 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:16.381 [2024-07-12 11:58:06.383947] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.381 "name": "Existed_Raid", 00:17:16.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:16.381 "strip_size_kb": 0, 00:17:16.381 "state": "configuring", 00:17:16.381 "raid_level": "raid1", 00:17:16.381 "superblock": false, 00:17:16.381 "num_base_bdevs": 4, 00:17:16.381 "num_base_bdevs_discovered": 3, 00:17:16.381 "num_base_bdevs_operational": 4, 00:17:16.381 "base_bdevs_list": [ 00:17:16.381 { 00:17:16.381 "name": "BaseBdev1", 00:17:16.381 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:16.381 "is_configured": true, 00:17:16.381 "data_offset": 0, 00:17:16.381 "data_size": 65536 00:17:16.381 }, 00:17:16.381 { 00:17:16.381 "name": null, 00:17:16.381 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:16.381 "is_configured": false, 00:17:16.381 "data_offset": 0, 00:17:16.381 "data_size": 65536 00:17:16.381 }, 00:17:16.381 { 00:17:16.381 "name": 
"BaseBdev3", 00:17:16.381 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:16.381 "is_configured": true, 00:17:16.381 "data_offset": 0, 00:17:16.381 "data_size": 65536 00:17:16.381 }, 00:17:16.381 { 00:17:16.381 "name": "BaseBdev4", 00:17:16.381 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:16.381 "is_configured": true, 00:17:16.381 "data_offset": 0, 00:17:16.381 "data_size": 65536 00:17:16.381 } 00:17:16.381 ] 00:17:16.381 }' 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.381 11:58:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.947 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.947 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:16.947 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:16.947 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:17.205 [2024-07-12 11:58:07.334440] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.205 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:17.463 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.463 "name": "Existed_Raid", 00:17:17.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.463 "strip_size_kb": 0, 00:17:17.463 "state": "configuring", 00:17:17.463 "raid_level": "raid1", 00:17:17.463 "superblock": false, 00:17:17.463 "num_base_bdevs": 4, 00:17:17.463 "num_base_bdevs_discovered": 2, 00:17:17.463 "num_base_bdevs_operational": 4, 00:17:17.463 "base_bdevs_list": [ 00:17:17.463 { 00:17:17.463 "name": null, 00:17:17.463 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:17.463 "is_configured": false, 00:17:17.463 "data_offset": 0, 00:17:17.463 "data_size": 65536 00:17:17.463 }, 00:17:17.463 { 00:17:17.463 "name": null, 00:17:17.463 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:17.463 "is_configured": false, 00:17:17.463 "data_offset": 0, 00:17:17.463 "data_size": 65536 00:17:17.463 }, 00:17:17.463 { 00:17:17.463 "name": "BaseBdev3", 00:17:17.463 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:17.463 "is_configured": true, 00:17:17.463 
"data_offset": 0, 00:17:17.463 "data_size": 65536 00:17:17.463 }, 00:17:17.463 { 00:17:17.463 "name": "BaseBdev4", 00:17:17.463 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:17.463 "is_configured": true, 00:17:17.463 "data_offset": 0, 00:17:17.463 "data_size": 65536 00:17:17.463 } 00:17:17.463 ] 00:17:17.463 }' 00:17:17.463 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.463 11:58:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:18.031 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.031 11:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:18.031 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:18.031 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:18.289 [2024-07-12 11:58:08.302711] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.289 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:18.290 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.290 "name": "Existed_Raid", 00:17:18.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.290 "strip_size_kb": 0, 00:17:18.290 "state": "configuring", 00:17:18.290 "raid_level": "raid1", 00:17:18.290 "superblock": false, 00:17:18.290 "num_base_bdevs": 4, 00:17:18.290 "num_base_bdevs_discovered": 3, 00:17:18.290 "num_base_bdevs_operational": 4, 00:17:18.290 "base_bdevs_list": [ 00:17:18.290 { 00:17:18.290 "name": null, 00:17:18.290 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:18.290 "is_configured": false, 00:17:18.290 "data_offset": 0, 00:17:18.290 "data_size": 65536 00:17:18.290 }, 00:17:18.290 { 00:17:18.290 "name": "BaseBdev2", 00:17:18.290 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:18.290 "is_configured": true, 00:17:18.290 "data_offset": 0, 00:17:18.290 "data_size": 65536 00:17:18.290 }, 00:17:18.290 { 00:17:18.290 "name": "BaseBdev3", 00:17:18.290 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:18.290 "is_configured": true, 00:17:18.290 "data_offset": 0, 00:17:18.290 "data_size": 65536 00:17:18.290 }, 
00:17:18.290 { 00:17:18.290 "name": "BaseBdev4", 00:17:18.290 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:18.290 "is_configured": true, 00:17:18.290 "data_offset": 0, 00:17:18.290 "data_size": 65536 00:17:18.290 } 00:17:18.290 ] 00:17:18.290 }' 00:17:18.290 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.290 11:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:18.855 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:18.855 11:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.113 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:19.113 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.113 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:19.113 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df 00:17:19.371 [2024-07-12 11:58:09.456304] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:19.371 [2024-07-12 11:58:09.456333] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2633b50 00:17:19.371 [2024-07-12 11:58:09.456337] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:19.372 [2024-07-12 11:58:09.456472] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x246c940 00:17:19.372 [2024-07-12 
11:58:09.456576] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2633b50 00:17:19.372 [2024-07-12 11:58:09.456583] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2633b50 00:17:19.372 [2024-07-12 11:58:09.456733] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:19.372 NewBaseBdev 00:17:19.372 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:19.372 11:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:19.372 11:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:19.372 11:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:19.372 11:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:19.372 11:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:19.372 11:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:19.631 [ 00:17:19.631 { 00:17:19.631 "name": "NewBaseBdev", 00:17:19.631 "aliases": [ 00:17:19.631 "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df" 00:17:19.631 ], 00:17:19.631 "product_name": "Malloc disk", 00:17:19.631 "block_size": 512, 00:17:19.631 "num_blocks": 65536, 00:17:19.631 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:19.631 "assigned_rate_limits": { 00:17:19.631 "rw_ios_per_sec": 0, 00:17:19.631 "rw_mbytes_per_sec": 0, 00:17:19.631 "r_mbytes_per_sec": 0, 00:17:19.631 
"w_mbytes_per_sec": 0 00:17:19.631 }, 00:17:19.631 "claimed": true, 00:17:19.631 "claim_type": "exclusive_write", 00:17:19.631 "zoned": false, 00:17:19.631 "supported_io_types": { 00:17:19.631 "read": true, 00:17:19.631 "write": true, 00:17:19.631 "unmap": true, 00:17:19.631 "flush": true, 00:17:19.631 "reset": true, 00:17:19.631 "nvme_admin": false, 00:17:19.631 "nvme_io": false, 00:17:19.631 "nvme_io_md": false, 00:17:19.631 "write_zeroes": true, 00:17:19.631 "zcopy": true, 00:17:19.631 "get_zone_info": false, 00:17:19.631 "zone_management": false, 00:17:19.631 "zone_append": false, 00:17:19.631 "compare": false, 00:17:19.631 "compare_and_write": false, 00:17:19.631 "abort": true, 00:17:19.631 "seek_hole": false, 00:17:19.631 "seek_data": false, 00:17:19.631 "copy": true, 00:17:19.631 "nvme_iov_md": false 00:17:19.631 }, 00:17:19.631 "memory_domains": [ 00:17:19.631 { 00:17:19.631 "dma_device_id": "system", 00:17:19.631 "dma_device_type": 1 00:17:19.631 }, 00:17:19.631 { 00:17:19.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.631 "dma_device_type": 2 00:17:19.631 } 00:17:19.631 ], 00:17:19.631 "driver_specific": {} 00:17:19.631 } 00:17:19.631 ] 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.631 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:19.890 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.890 "name": "Existed_Raid", 00:17:19.890 "uuid": "313ba880-8ab4-4f90-b746-f4158ab253a7", 00:17:19.890 "strip_size_kb": 0, 00:17:19.890 "state": "online", 00:17:19.890 "raid_level": "raid1", 00:17:19.890 "superblock": false, 00:17:19.890 "num_base_bdevs": 4, 00:17:19.890 "num_base_bdevs_discovered": 4, 00:17:19.890 "num_base_bdevs_operational": 4, 00:17:19.890 "base_bdevs_list": [ 00:17:19.890 { 00:17:19.890 "name": "NewBaseBdev", 00:17:19.890 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:19.890 "is_configured": true, 00:17:19.890 "data_offset": 0, 00:17:19.890 "data_size": 65536 00:17:19.890 }, 00:17:19.890 { 00:17:19.890 "name": "BaseBdev2", 00:17:19.890 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:19.890 "is_configured": true, 00:17:19.890 "data_offset": 0, 00:17:19.890 "data_size": 65536 00:17:19.890 }, 00:17:19.890 { 00:17:19.890 "name": "BaseBdev3", 00:17:19.890 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:19.890 "is_configured": true, 00:17:19.890 "data_offset": 0, 00:17:19.890 "data_size": 65536 00:17:19.890 }, 00:17:19.890 { 00:17:19.890 "name": "BaseBdev4", 
00:17:19.890 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:19.890 "is_configured": true, 00:17:19.890 "data_offset": 0, 00:17:19.890 "data_size": 65536 00:17:19.890 } 00:17:19.890 ] 00:17:19.890 }' 00:17:19.890 11:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.890 11:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:20.476 [2024-07-12 11:58:10.563387] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:20.476 "name": "Existed_Raid", 00:17:20.476 "aliases": [ 00:17:20.476 "313ba880-8ab4-4f90-b746-f4158ab253a7" 00:17:20.476 ], 00:17:20.476 "product_name": "Raid Volume", 00:17:20.476 "block_size": 512, 00:17:20.476 "num_blocks": 65536, 00:17:20.476 "uuid": "313ba880-8ab4-4f90-b746-f4158ab253a7", 00:17:20.476 "assigned_rate_limits": { 00:17:20.476 "rw_ios_per_sec": 0, 00:17:20.476 
"rw_mbytes_per_sec": 0, 00:17:20.476 "r_mbytes_per_sec": 0, 00:17:20.476 "w_mbytes_per_sec": 0 00:17:20.476 }, 00:17:20.476 "claimed": false, 00:17:20.476 "zoned": false, 00:17:20.476 "supported_io_types": { 00:17:20.476 "read": true, 00:17:20.476 "write": true, 00:17:20.476 "unmap": false, 00:17:20.476 "flush": false, 00:17:20.476 "reset": true, 00:17:20.476 "nvme_admin": false, 00:17:20.476 "nvme_io": false, 00:17:20.476 "nvme_io_md": false, 00:17:20.476 "write_zeroes": true, 00:17:20.476 "zcopy": false, 00:17:20.476 "get_zone_info": false, 00:17:20.476 "zone_management": false, 00:17:20.476 "zone_append": false, 00:17:20.476 "compare": false, 00:17:20.476 "compare_and_write": false, 00:17:20.476 "abort": false, 00:17:20.476 "seek_hole": false, 00:17:20.476 "seek_data": false, 00:17:20.476 "copy": false, 00:17:20.476 "nvme_iov_md": false 00:17:20.476 }, 00:17:20.476 "memory_domains": [ 00:17:20.476 { 00:17:20.476 "dma_device_id": "system", 00:17:20.476 "dma_device_type": 1 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.476 "dma_device_type": 2 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "dma_device_id": "system", 00:17:20.476 "dma_device_type": 1 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.476 "dma_device_type": 2 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "dma_device_id": "system", 00:17:20.476 "dma_device_type": 1 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.476 "dma_device_type": 2 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "dma_device_id": "system", 00:17:20.476 "dma_device_type": 1 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.476 "dma_device_type": 2 00:17:20.476 } 00:17:20.476 ], 00:17:20.476 "driver_specific": { 00:17:20.476 "raid": { 00:17:20.476 "uuid": "313ba880-8ab4-4f90-b746-f4158ab253a7", 00:17:20.476 "strip_size_kb": 0, 00:17:20.476 "state": "online", 
00:17:20.476 "raid_level": "raid1", 00:17:20.476 "superblock": false, 00:17:20.476 "num_base_bdevs": 4, 00:17:20.476 "num_base_bdevs_discovered": 4, 00:17:20.476 "num_base_bdevs_operational": 4, 00:17:20.476 "base_bdevs_list": [ 00:17:20.476 { 00:17:20.476 "name": "NewBaseBdev", 00:17:20.476 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:20.476 "is_configured": true, 00:17:20.476 "data_offset": 0, 00:17:20.476 "data_size": 65536 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "name": "BaseBdev2", 00:17:20.476 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:20.476 "is_configured": true, 00:17:20.476 "data_offset": 0, 00:17:20.476 "data_size": 65536 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "name": "BaseBdev3", 00:17:20.476 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:20.476 "is_configured": true, 00:17:20.476 "data_offset": 0, 00:17:20.476 "data_size": 65536 00:17:20.476 }, 00:17:20.476 { 00:17:20.476 "name": "BaseBdev4", 00:17:20.476 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:20.476 "is_configured": true, 00:17:20.476 "data_offset": 0, 00:17:20.476 "data_size": 65536 00:17:20.476 } 00:17:20.476 ] 00:17:20.476 } 00:17:20.476 } 00:17:20.476 }' 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:20.476 BaseBdev2 00:17:20.476 BaseBdev3 00:17:20.476 BaseBdev4' 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:20.476 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:20.743 11:58:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:20.743 "name": "NewBaseBdev", 00:17:20.743 "aliases": [ 00:17:20.743 "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df" 00:17:20.743 ], 00:17:20.743 "product_name": "Malloc disk", 00:17:20.743 "block_size": 512, 00:17:20.743 "num_blocks": 65536, 00:17:20.743 "uuid": "0b3e3bfa-4c2f-4e1f-bbf4-ef3b89af18df", 00:17:20.743 "assigned_rate_limits": { 00:17:20.743 "rw_ios_per_sec": 0, 00:17:20.743 "rw_mbytes_per_sec": 0, 00:17:20.743 "r_mbytes_per_sec": 0, 00:17:20.743 "w_mbytes_per_sec": 0 00:17:20.743 }, 00:17:20.743 "claimed": true, 00:17:20.743 "claim_type": "exclusive_write", 00:17:20.743 "zoned": false, 00:17:20.743 "supported_io_types": { 00:17:20.743 "read": true, 00:17:20.743 "write": true, 00:17:20.743 "unmap": true, 00:17:20.743 "flush": true, 00:17:20.743 "reset": true, 00:17:20.743 "nvme_admin": false, 00:17:20.743 "nvme_io": false, 00:17:20.743 "nvme_io_md": false, 00:17:20.743 "write_zeroes": true, 00:17:20.743 "zcopy": true, 00:17:20.743 "get_zone_info": false, 00:17:20.743 "zone_management": false, 00:17:20.743 "zone_append": false, 00:17:20.743 "compare": false, 00:17:20.743 "compare_and_write": false, 00:17:20.743 "abort": true, 00:17:20.743 "seek_hole": false, 00:17:20.743 "seek_data": false, 00:17:20.743 "copy": true, 00:17:20.743 "nvme_iov_md": false 00:17:20.743 }, 00:17:20.743 "memory_domains": [ 00:17:20.743 { 00:17:20.743 "dma_device_id": "system", 00:17:20.743 "dma_device_type": 1 00:17:20.743 }, 00:17:20.743 { 00:17:20.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.743 "dma_device_type": 2 00:17:20.743 } 00:17:20.743 ], 00:17:20.743 "driver_specific": {} 00:17:20.743 }' 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:20.743 11:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.002 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.002 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:21.002 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:21.002 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:21.002 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:21.002 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:21.002 "name": "BaseBdev2", 00:17:21.002 "aliases": [ 00:17:21.002 "4c64537f-23b4-4ed7-94c6-ac8d22953c6f" 00:17:21.002 ], 00:17:21.002 "product_name": "Malloc disk", 00:17:21.002 "block_size": 512, 00:17:21.002 "num_blocks": 65536, 00:17:21.002 "uuid": "4c64537f-23b4-4ed7-94c6-ac8d22953c6f", 00:17:21.002 "assigned_rate_limits": { 00:17:21.002 "rw_ios_per_sec": 0, 00:17:21.002 "rw_mbytes_per_sec": 0, 00:17:21.002 "r_mbytes_per_sec": 0, 00:17:21.002 "w_mbytes_per_sec": 0 00:17:21.002 }, 00:17:21.002 "claimed": true, 00:17:21.002 
"claim_type": "exclusive_write", 00:17:21.002 "zoned": false, 00:17:21.002 "supported_io_types": { 00:17:21.002 "read": true, 00:17:21.002 "write": true, 00:17:21.002 "unmap": true, 00:17:21.002 "flush": true, 00:17:21.002 "reset": true, 00:17:21.002 "nvme_admin": false, 00:17:21.002 "nvme_io": false, 00:17:21.002 "nvme_io_md": false, 00:17:21.002 "write_zeroes": true, 00:17:21.002 "zcopy": true, 00:17:21.002 "get_zone_info": false, 00:17:21.002 "zone_management": false, 00:17:21.002 "zone_append": false, 00:17:21.002 "compare": false, 00:17:21.002 "compare_and_write": false, 00:17:21.002 "abort": true, 00:17:21.002 "seek_hole": false, 00:17:21.002 "seek_data": false, 00:17:21.002 "copy": true, 00:17:21.002 "nvme_iov_md": false 00:17:21.003 }, 00:17:21.003 "memory_domains": [ 00:17:21.003 { 00:17:21.003 "dma_device_id": "system", 00:17:21.003 "dma_device_type": 1 00:17:21.003 }, 00:17:21.003 { 00:17:21.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.003 "dma_device_type": 2 00:17:21.003 } 00:17:21.003 ], 00:17:21.003 "driver_specific": {} 00:17:21.003 }' 00:17:21.003 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.261 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.520 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:21.521 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:21.521 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:21.521 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:21.521 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:21.521 "name": "BaseBdev3", 00:17:21.521 "aliases": [ 00:17:21.521 "1af89eaf-0825-41b6-a855-9913160a5e58" 00:17:21.521 ], 00:17:21.521 "product_name": "Malloc disk", 00:17:21.521 "block_size": 512, 00:17:21.521 "num_blocks": 65536, 00:17:21.521 "uuid": "1af89eaf-0825-41b6-a855-9913160a5e58", 00:17:21.521 "assigned_rate_limits": { 00:17:21.521 "rw_ios_per_sec": 0, 00:17:21.521 "rw_mbytes_per_sec": 0, 00:17:21.521 "r_mbytes_per_sec": 0, 00:17:21.521 "w_mbytes_per_sec": 0 00:17:21.521 }, 00:17:21.521 "claimed": true, 00:17:21.521 "claim_type": "exclusive_write", 00:17:21.521 "zoned": false, 00:17:21.521 "supported_io_types": { 00:17:21.521 "read": true, 00:17:21.521 "write": true, 00:17:21.521 "unmap": true, 00:17:21.521 "flush": true, 00:17:21.521 "reset": true, 00:17:21.521 "nvme_admin": false, 00:17:21.521 "nvme_io": false, 00:17:21.521 "nvme_io_md": false, 00:17:21.521 "write_zeroes": true, 00:17:21.521 "zcopy": true, 00:17:21.521 "get_zone_info": false, 00:17:21.521 "zone_management": false, 00:17:21.521 "zone_append": false, 00:17:21.521 "compare": false, 00:17:21.521 "compare_and_write": false, 00:17:21.521 "abort": true, 00:17:21.521 
"seek_hole": false, 00:17:21.521 "seek_data": false, 00:17:21.521 "copy": true, 00:17:21.521 "nvme_iov_md": false 00:17:21.521 }, 00:17:21.521 "memory_domains": [ 00:17:21.521 { 00:17:21.521 "dma_device_id": "system", 00:17:21.521 "dma_device_type": 1 00:17:21.521 }, 00:17:21.521 { 00:17:21.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.521 "dma_device_type": 2 00:17:21.521 } 00:17:21.521 ], 00:17:21.521 "driver_specific": {} 00:17:21.521 }' 00:17:21.521 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.521 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:21.780 11:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:21.780 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:17:21.781 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:22.039 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:22.039 "name": "BaseBdev4", 00:17:22.039 "aliases": [ 00:17:22.039 "dc5014d8-3908-4b0d-9900-dea7e901a722" 00:17:22.039 ], 00:17:22.040 "product_name": "Malloc disk", 00:17:22.040 "block_size": 512, 00:17:22.040 "num_blocks": 65536, 00:17:22.040 "uuid": "dc5014d8-3908-4b0d-9900-dea7e901a722", 00:17:22.040 "assigned_rate_limits": { 00:17:22.040 "rw_ios_per_sec": 0, 00:17:22.040 "rw_mbytes_per_sec": 0, 00:17:22.040 "r_mbytes_per_sec": 0, 00:17:22.040 "w_mbytes_per_sec": 0 00:17:22.040 }, 00:17:22.040 "claimed": true, 00:17:22.040 "claim_type": "exclusive_write", 00:17:22.040 "zoned": false, 00:17:22.040 "supported_io_types": { 00:17:22.040 "read": true, 00:17:22.040 "write": true, 00:17:22.040 "unmap": true, 00:17:22.040 "flush": true, 00:17:22.040 "reset": true, 00:17:22.040 "nvme_admin": false, 00:17:22.040 "nvme_io": false, 00:17:22.040 "nvme_io_md": false, 00:17:22.040 "write_zeroes": true, 00:17:22.040 "zcopy": true, 00:17:22.040 "get_zone_info": false, 00:17:22.040 "zone_management": false, 00:17:22.040 "zone_append": false, 00:17:22.040 "compare": false, 00:17:22.040 "compare_and_write": false, 00:17:22.040 "abort": true, 00:17:22.040 "seek_hole": false, 00:17:22.040 "seek_data": false, 00:17:22.040 "copy": true, 00:17:22.040 "nvme_iov_md": false 00:17:22.040 }, 00:17:22.040 "memory_domains": [ 00:17:22.040 { 00:17:22.040 "dma_device_id": "system", 00:17:22.040 "dma_device_type": 1 00:17:22.040 }, 00:17:22.040 { 00:17:22.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.040 "dma_device_type": 2 00:17:22.040 } 00:17:22.040 ], 00:17:22.040 "driver_specific": {} 00:17:22.040 }' 00:17:22.040 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.040 11:58:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.040 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:22.040 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.299 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.299 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:22.299 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.299 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.299 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:22.299 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.299 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.299 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:22.299 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:22.558 [2024-07-12 11:58:12.632556] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:22.558 [2024-07-12 11:58:12.632576] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:22.558 [2024-07-12 11:58:12.632610] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:22.558 [2024-07-12 11:58:12.632791] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:22.558 [2024-07-12 11:58:12.632797] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2633b50 name Existed_Raid, state offline 00:17:22.558 11:58:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 677556 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 677556 ']' 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 677556 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 677556 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 677556' 00:17:22.558 killing process with pid 677556 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 677556 00:17:22.558 [2024-07-12 11:58:12.695084] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:22.558 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 677556 00:17:22.558 [2024-07-12 11:58:12.725740] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:22.817 00:17:22.817 real 0m24.102s 00:17:22.817 user 0m44.897s 00:17:22.817 sys 0m3.763s 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.817 ************************************ 00:17:22.817 END TEST raid_state_function_test 
00:17:22.817 ************************************ 00:17:22.817 11:58:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:22.817 11:58:12 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:17:22.817 11:58:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:22.817 11:58:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:22.817 11:58:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:22.817 ************************************ 00:17:22.817 START TEST raid_state_function_test_sb 00:17:22.817 ************************************ 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.817 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@244 -- # raid_pid=682277 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 682277' 00:17:22.818 Process raid pid: 682277 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 682277 /var/tmp/spdk-raid.sock 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 682277 ']' 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:22.818 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:22.818 11:58:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:22.818 [2024-07-12 11:58:13.017053] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:17:22.818 [2024-07-12 11:58:13.017088] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:23.075 [2024-07-12 11:58:13.079635] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:23.075 [2024-07-12 11:58:13.157136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:17:23.075 [2024-07-12 11:58:13.207458] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:23.075 [2024-07-12 11:58:13.207480] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:23.642 11:58:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:23.642 11:58:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:23.642 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:23.902 [2024-07-12 11:58:13.938173] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:23.902 [2024-07-12 11:58:13.938202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:23.902 [2024-07-12 11:58:13.938208] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:23.902 [2024-07-12 11:58:13.938213] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:23.902 [2024-07-12 11:58:13.938218] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:23.902 [2024-07-12 11:58:13.938222] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:23.902 [2024-07-12 
11:58:13.938226] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:23.902 [2024-07-12 11:58:13.938231] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.902 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.902 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.902 "name": "Existed_Raid", 00:17:23.902 "uuid": "adf77503-0ace-480a-8e9a-3e7923bcaef6", 00:17:23.902 "strip_size_kb": 
0, 00:17:23.902 "state": "configuring", 00:17:23.902 "raid_level": "raid1", 00:17:23.902 "superblock": true, 00:17:23.902 "num_base_bdevs": 4, 00:17:23.902 "num_base_bdevs_discovered": 0, 00:17:23.902 "num_base_bdevs_operational": 4, 00:17:23.902 "base_bdevs_list": [ 00:17:23.902 { 00:17:23.902 "name": "BaseBdev1", 00:17:23.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.902 "is_configured": false, 00:17:23.902 "data_offset": 0, 00:17:23.902 "data_size": 0 00:17:23.902 }, 00:17:23.902 { 00:17:23.902 "name": "BaseBdev2", 00:17:23.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.902 "is_configured": false, 00:17:23.902 "data_offset": 0, 00:17:23.902 "data_size": 0 00:17:23.902 }, 00:17:23.902 { 00:17:23.902 "name": "BaseBdev3", 00:17:23.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.902 "is_configured": false, 00:17:23.902 "data_offset": 0, 00:17:23.902 "data_size": 0 00:17:23.902 }, 00:17:23.902 { 00:17:23.902 "name": "BaseBdev4", 00:17:23.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.902 "is_configured": false, 00:17:23.902 "data_offset": 0, 00:17:23.902 "data_size": 0 00:17:23.902 } 00:17:23.902 ] 00:17:23.902 }' 00:17:23.902 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.902 11:58:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:24.470 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:24.728 [2024-07-12 11:58:14.748289] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:24.728 [2024-07-12 11:58:14.748308] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22a81f0 name Existed_Raid, state configuring 00:17:24.729 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:24.729 [2024-07-12 11:58:14.920831] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:24.729 [2024-07-12 11:58:14.920849] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:24.729 [2024-07-12 11:58:14.920853] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:24.729 [2024-07-12 11:58:14.920858] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:24.729 [2024-07-12 11:58:14.920863] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:24.729 [2024-07-12 11:58:14.920868] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:24.729 [2024-07-12 11:58:14.920887] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:24.729 [2024-07-12 11:58:14.920892] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:24.729 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:24.987 [2024-07-12 11:58:15.101838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:24.987 BaseBdev1 00:17:24.987 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:24.987 11:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:24.987 11:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:24.987 11:58:15 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:24.987 11:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:24.987 11:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:24.987 11:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:25.246 [ 00:17:25.246 { 00:17:25.246 "name": "BaseBdev1", 00:17:25.246 "aliases": [ 00:17:25.246 "2d2faeea-2257-4491-a758-42696a09d8ce" 00:17:25.246 ], 00:17:25.246 "product_name": "Malloc disk", 00:17:25.246 "block_size": 512, 00:17:25.246 "num_blocks": 65536, 00:17:25.246 "uuid": "2d2faeea-2257-4491-a758-42696a09d8ce", 00:17:25.246 "assigned_rate_limits": { 00:17:25.246 "rw_ios_per_sec": 0, 00:17:25.246 "rw_mbytes_per_sec": 0, 00:17:25.246 "r_mbytes_per_sec": 0, 00:17:25.246 "w_mbytes_per_sec": 0 00:17:25.246 }, 00:17:25.246 "claimed": true, 00:17:25.246 "claim_type": "exclusive_write", 00:17:25.246 "zoned": false, 00:17:25.246 "supported_io_types": { 00:17:25.246 "read": true, 00:17:25.246 "write": true, 00:17:25.246 "unmap": true, 00:17:25.246 "flush": true, 00:17:25.246 "reset": true, 00:17:25.246 "nvme_admin": false, 00:17:25.246 "nvme_io": false, 00:17:25.246 "nvme_io_md": false, 00:17:25.246 "write_zeroes": true, 00:17:25.246 "zcopy": true, 00:17:25.246 "get_zone_info": false, 00:17:25.246 "zone_management": false, 00:17:25.246 "zone_append": false, 00:17:25.246 "compare": false, 00:17:25.246 "compare_and_write": false, 00:17:25.246 "abort": true, 00:17:25.246 "seek_hole": false, 00:17:25.246 "seek_data": false, 
00:17:25.246 "copy": true, 00:17:25.246 "nvme_iov_md": false 00:17:25.246 }, 00:17:25.246 "memory_domains": [ 00:17:25.246 { 00:17:25.246 "dma_device_id": "system", 00:17:25.246 "dma_device_type": 1 00:17:25.246 }, 00:17:25.246 { 00:17:25.246 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.246 "dma_device_type": 2 00:17:25.246 } 00:17:25.246 ], 00:17:25.246 "driver_specific": {} 00:17:25.246 } 00:17:25.246 ] 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.246 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:17:25.505 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.505 "name": "Existed_Raid", 00:17:25.505 "uuid": "9e236317-ed71-44e8-abf7-370d36fb54a8", 00:17:25.505 "strip_size_kb": 0, 00:17:25.505 "state": "configuring", 00:17:25.505 "raid_level": "raid1", 00:17:25.505 "superblock": true, 00:17:25.505 "num_base_bdevs": 4, 00:17:25.505 "num_base_bdevs_discovered": 1, 00:17:25.505 "num_base_bdevs_operational": 4, 00:17:25.505 "base_bdevs_list": [ 00:17:25.505 { 00:17:25.505 "name": "BaseBdev1", 00:17:25.505 "uuid": "2d2faeea-2257-4491-a758-42696a09d8ce", 00:17:25.505 "is_configured": true, 00:17:25.505 "data_offset": 2048, 00:17:25.505 "data_size": 63488 00:17:25.505 }, 00:17:25.505 { 00:17:25.505 "name": "BaseBdev2", 00:17:25.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.505 "is_configured": false, 00:17:25.505 "data_offset": 0, 00:17:25.505 "data_size": 0 00:17:25.505 }, 00:17:25.505 { 00:17:25.505 "name": "BaseBdev3", 00:17:25.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.505 "is_configured": false, 00:17:25.505 "data_offset": 0, 00:17:25.505 "data_size": 0 00:17:25.505 }, 00:17:25.505 { 00:17:25.505 "name": "BaseBdev4", 00:17:25.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.505 "is_configured": false, 00:17:25.505 "data_offset": 0, 00:17:25.505 "data_size": 0 00:17:25.505 } 00:17:25.505 ] 00:17:25.505 }' 00:17:25.505 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.505 11:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:26.073 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:26.073 [2024-07-12 11:58:16.260834] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:17:26.073 [2024-07-12 11:58:16.260861] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22a7a60 name Existed_Raid, state configuring 00:17:26.073 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:26.333 [2024-07-12 11:58:16.437316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:26.333 [2024-07-12 11:58:16.438415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:26.333 [2024-07-12 11:58:16.438439] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:26.333 [2024-07-12 11:58:16.438445] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:26.333 [2024-07-12 11:58:16.438450] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:26.333 [2024-07-12 11:58:16.438455] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:26.333 [2024-07-12 11:58:16.438461] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.333 11:58:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.333 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.592 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.592 "name": "Existed_Raid", 00:17:26.592 "uuid": "19ef10ed-f580-41fc-8db4-39b5ea8b915f", 00:17:26.592 "strip_size_kb": 0, 00:17:26.592 "state": "configuring", 00:17:26.592 "raid_level": "raid1", 00:17:26.592 "superblock": true, 00:17:26.592 "num_base_bdevs": 4, 00:17:26.592 "num_base_bdevs_discovered": 1, 00:17:26.592 "num_base_bdevs_operational": 4, 00:17:26.592 "base_bdevs_list": [ 00:17:26.592 { 00:17:26.592 "name": "BaseBdev1", 00:17:26.592 "uuid": "2d2faeea-2257-4491-a758-42696a09d8ce", 00:17:26.592 "is_configured": true, 00:17:26.592 "data_offset": 2048, 00:17:26.592 "data_size": 63488 00:17:26.592 }, 00:17:26.592 { 00:17:26.592 "name": "BaseBdev2", 00:17:26.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.592 "is_configured": false, 
00:17:26.592 "data_offset": 0, 00:17:26.592 "data_size": 0 00:17:26.592 }, 00:17:26.592 { 00:17:26.592 "name": "BaseBdev3", 00:17:26.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.592 "is_configured": false, 00:17:26.592 "data_offset": 0, 00:17:26.592 "data_size": 0 00:17:26.592 }, 00:17:26.592 { 00:17:26.592 "name": "BaseBdev4", 00:17:26.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.592 "is_configured": false, 00:17:26.592 "data_offset": 0, 00:17:26.592 "data_size": 0 00:17:26.592 } 00:17:26.592 ] 00:17:26.592 }' 00:17:26.592 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.592 11:58:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:27.160 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:27.160 [2024-07-12 11:58:17.270101] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:27.160 BaseBdev2 00:17:27.160 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:27.160 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:27.160 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:27.160 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:27.160 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:27.160 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:27.160 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:27.418 [ 00:17:27.418 { 00:17:27.418 "name": "BaseBdev2", 00:17:27.418 "aliases": [ 00:17:27.418 "631b6f48-7c3a-4b22-8e07-d29e85857eb4" 00:17:27.418 ], 00:17:27.418 "product_name": "Malloc disk", 00:17:27.418 "block_size": 512, 00:17:27.418 "num_blocks": 65536, 00:17:27.418 "uuid": "631b6f48-7c3a-4b22-8e07-d29e85857eb4", 00:17:27.418 "assigned_rate_limits": { 00:17:27.418 "rw_ios_per_sec": 0, 00:17:27.418 "rw_mbytes_per_sec": 0, 00:17:27.418 "r_mbytes_per_sec": 0, 00:17:27.418 "w_mbytes_per_sec": 0 00:17:27.418 }, 00:17:27.418 "claimed": true, 00:17:27.418 "claim_type": "exclusive_write", 00:17:27.418 "zoned": false, 00:17:27.418 "supported_io_types": { 00:17:27.418 "read": true, 00:17:27.418 "write": true, 00:17:27.418 "unmap": true, 00:17:27.418 "flush": true, 00:17:27.418 "reset": true, 00:17:27.418 "nvme_admin": false, 00:17:27.418 "nvme_io": false, 00:17:27.418 "nvme_io_md": false, 00:17:27.418 "write_zeroes": true, 00:17:27.418 "zcopy": true, 00:17:27.418 "get_zone_info": false, 00:17:27.418 "zone_management": false, 00:17:27.418 "zone_append": false, 00:17:27.418 "compare": false, 00:17:27.418 "compare_and_write": false, 00:17:27.418 "abort": true, 00:17:27.418 "seek_hole": false, 00:17:27.418 "seek_data": false, 00:17:27.418 "copy": true, 00:17:27.418 "nvme_iov_md": false 00:17:27.418 }, 00:17:27.418 "memory_domains": [ 00:17:27.418 { 00:17:27.418 "dma_device_id": "system", 00:17:27.418 "dma_device_type": 1 00:17:27.418 }, 00:17:27.418 { 00:17:27.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.418 "dma_device_type": 2 00:17:27.418 } 00:17:27.418 ], 00:17:27.418 "driver_specific": {} 00:17:27.418 } 00:17:27.418 ] 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.418 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.677 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.677 "name": "Existed_Raid", 00:17:27.677 "uuid": "19ef10ed-f580-41fc-8db4-39b5ea8b915f", 00:17:27.677 "strip_size_kb": 0, 
00:17:27.677 "state": "configuring", 00:17:27.677 "raid_level": "raid1", 00:17:27.677 "superblock": true, 00:17:27.677 "num_base_bdevs": 4, 00:17:27.677 "num_base_bdevs_discovered": 2, 00:17:27.677 "num_base_bdevs_operational": 4, 00:17:27.677 "base_bdevs_list": [ 00:17:27.677 { 00:17:27.677 "name": "BaseBdev1", 00:17:27.677 "uuid": "2d2faeea-2257-4491-a758-42696a09d8ce", 00:17:27.677 "is_configured": true, 00:17:27.677 "data_offset": 2048, 00:17:27.677 "data_size": 63488 00:17:27.677 }, 00:17:27.677 { 00:17:27.677 "name": "BaseBdev2", 00:17:27.677 "uuid": "631b6f48-7c3a-4b22-8e07-d29e85857eb4", 00:17:27.677 "is_configured": true, 00:17:27.677 "data_offset": 2048, 00:17:27.677 "data_size": 63488 00:17:27.677 }, 00:17:27.677 { 00:17:27.677 "name": "BaseBdev3", 00:17:27.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.677 "is_configured": false, 00:17:27.677 "data_offset": 0, 00:17:27.677 "data_size": 0 00:17:27.677 }, 00:17:27.677 { 00:17:27.677 "name": "BaseBdev4", 00:17:27.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.677 "is_configured": false, 00:17:27.677 "data_offset": 0, 00:17:27.677 "data_size": 0 00:17:27.677 } 00:17:27.677 ] 00:17:27.677 }' 00:17:27.677 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.677 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:28.245 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:28.245 [2024-07-12 11:58:18.451751] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:28.245 BaseBdev3 00:17:28.245 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:28.245 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:17:28.245 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:28.245 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:28.245 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:28.245 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:28.245 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:28.504 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:28.764 [ 00:17:28.764 { 00:17:28.764 "name": "BaseBdev3", 00:17:28.764 "aliases": [ 00:17:28.764 "54c351a0-a2cc-4542-93c3-a7bba44fb2bb" 00:17:28.764 ], 00:17:28.764 "product_name": "Malloc disk", 00:17:28.764 "block_size": 512, 00:17:28.764 "num_blocks": 65536, 00:17:28.764 "uuid": "54c351a0-a2cc-4542-93c3-a7bba44fb2bb", 00:17:28.764 "assigned_rate_limits": { 00:17:28.764 "rw_ios_per_sec": 0, 00:17:28.764 "rw_mbytes_per_sec": 0, 00:17:28.764 "r_mbytes_per_sec": 0, 00:17:28.764 "w_mbytes_per_sec": 0 00:17:28.764 }, 00:17:28.764 "claimed": true, 00:17:28.764 "claim_type": "exclusive_write", 00:17:28.764 "zoned": false, 00:17:28.764 "supported_io_types": { 00:17:28.764 "read": true, 00:17:28.764 "write": true, 00:17:28.764 "unmap": true, 00:17:28.764 "flush": true, 00:17:28.764 "reset": true, 00:17:28.764 "nvme_admin": false, 00:17:28.764 "nvme_io": false, 00:17:28.764 "nvme_io_md": false, 00:17:28.764 "write_zeroes": true, 00:17:28.764 "zcopy": true, 00:17:28.764 "get_zone_info": false, 00:17:28.764 "zone_management": false, 00:17:28.764 "zone_append": false, 00:17:28.764 
"compare": false, 00:17:28.764 "compare_and_write": false, 00:17:28.764 "abort": true, 00:17:28.764 "seek_hole": false, 00:17:28.764 "seek_data": false, 00:17:28.764 "copy": true, 00:17:28.764 "nvme_iov_md": false 00:17:28.764 }, 00:17:28.764 "memory_domains": [ 00:17:28.764 { 00:17:28.764 "dma_device_id": "system", 00:17:28.764 "dma_device_type": 1 00:17:28.764 }, 00:17:28.764 { 00:17:28.764 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.764 "dma_device_type": 2 00:17:28.764 } 00:17:28.764 ], 00:17:28.764 "driver_specific": {} 00:17:28.764 } 00:17:28.764 ] 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.764 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.764 11:58:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.765 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.765 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.765 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.765 "name": "Existed_Raid", 00:17:28.765 "uuid": "19ef10ed-f580-41fc-8db4-39b5ea8b915f", 00:17:28.765 "strip_size_kb": 0, 00:17:28.765 "state": "configuring", 00:17:28.765 "raid_level": "raid1", 00:17:28.765 "superblock": true, 00:17:28.765 "num_base_bdevs": 4, 00:17:28.765 "num_base_bdevs_discovered": 3, 00:17:28.765 "num_base_bdevs_operational": 4, 00:17:28.765 "base_bdevs_list": [ 00:17:28.765 { 00:17:28.765 "name": "BaseBdev1", 00:17:28.765 "uuid": "2d2faeea-2257-4491-a758-42696a09d8ce", 00:17:28.765 "is_configured": true, 00:17:28.765 "data_offset": 2048, 00:17:28.765 "data_size": 63488 00:17:28.765 }, 00:17:28.765 { 00:17:28.765 "name": "BaseBdev2", 00:17:28.765 "uuid": "631b6f48-7c3a-4b22-8e07-d29e85857eb4", 00:17:28.765 "is_configured": true, 00:17:28.765 "data_offset": 2048, 00:17:28.765 "data_size": 63488 00:17:28.765 }, 00:17:28.765 { 00:17:28.765 "name": "BaseBdev3", 00:17:28.765 "uuid": "54c351a0-a2cc-4542-93c3-a7bba44fb2bb", 00:17:28.765 "is_configured": true, 00:17:28.765 "data_offset": 2048, 00:17:28.765 "data_size": 63488 00:17:28.765 }, 00:17:28.765 { 00:17:28.765 "name": "BaseBdev4", 00:17:28.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.765 "is_configured": false, 00:17:28.765 "data_offset": 0, 00:17:28.765 "data_size": 0 00:17:28.765 } 00:17:28.765 ] 00:17:28.765 }' 00:17:28.765 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.765 11:58:18 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:29.332 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:29.591 [2024-07-12 11:58:19.625332] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:29.591 [2024-07-12 11:58:19.625448] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22a8b90 00:17:29.591 [2024-07-12 11:58:19.625457] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:29.591 [2024-07-12 11:58:19.625593] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22a8700 00:17:29.591 [2024-07-12 11:58:19.625681] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22a8b90 00:17:29.591 [2024-07-12 11:58:19.625687] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22a8b90 00:17:29.591 [2024-07-12 11:58:19.625749] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:29.591 BaseBdev4 00:17:29.591 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:29.591 11:58:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:29.591 11:58:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:29.591 11:58:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:29.591 11:58:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:29.591 11:58:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:29.591 11:58:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:29.591 11:58:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:29.851 [ 00:17:29.851 { 00:17:29.851 "name": "BaseBdev4", 00:17:29.851 "aliases": [ 00:17:29.851 "315b85aa-0fca-4a7d-9a21-9b42a898c2cb" 00:17:29.851 ], 00:17:29.851 "product_name": "Malloc disk", 00:17:29.851 "block_size": 512, 00:17:29.851 "num_blocks": 65536, 00:17:29.851 "uuid": "315b85aa-0fca-4a7d-9a21-9b42a898c2cb", 00:17:29.851 "assigned_rate_limits": { 00:17:29.851 "rw_ios_per_sec": 0, 00:17:29.851 "rw_mbytes_per_sec": 0, 00:17:29.851 "r_mbytes_per_sec": 0, 00:17:29.851 "w_mbytes_per_sec": 0 00:17:29.851 }, 00:17:29.851 "claimed": true, 00:17:29.851 "claim_type": "exclusive_write", 00:17:29.851 "zoned": false, 00:17:29.851 "supported_io_types": { 00:17:29.851 "read": true, 00:17:29.851 "write": true, 00:17:29.851 "unmap": true, 00:17:29.851 "flush": true, 00:17:29.851 "reset": true, 00:17:29.851 "nvme_admin": false, 00:17:29.851 "nvme_io": false, 00:17:29.851 "nvme_io_md": false, 00:17:29.851 "write_zeroes": true, 00:17:29.851 "zcopy": true, 00:17:29.851 "get_zone_info": false, 00:17:29.851 "zone_management": false, 00:17:29.851 "zone_append": false, 00:17:29.851 "compare": false, 00:17:29.851 "compare_and_write": false, 00:17:29.851 "abort": true, 00:17:29.851 "seek_hole": false, 00:17:29.851 "seek_data": false, 00:17:29.851 "copy": true, 00:17:29.851 "nvme_iov_md": false 00:17:29.851 }, 00:17:29.851 "memory_domains": [ 00:17:29.851 { 00:17:29.851 "dma_device_id": "system", 00:17:29.851 "dma_device_type": 1 00:17:29.851 }, 00:17:29.851 { 00:17:29.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.851 "dma_device_type": 2 00:17:29.851 } 00:17:29.851 ], 00:17:29.851 "driver_specific": {} 00:17:29.851 } 00:17:29.851 ] 
00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:29.851 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.110 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.110 "name": "Existed_Raid", 00:17:30.110 
"uuid": "19ef10ed-f580-41fc-8db4-39b5ea8b915f", 00:17:30.110 "strip_size_kb": 0, 00:17:30.110 "state": "online", 00:17:30.110 "raid_level": "raid1", 00:17:30.110 "superblock": true, 00:17:30.110 "num_base_bdevs": 4, 00:17:30.110 "num_base_bdevs_discovered": 4, 00:17:30.110 "num_base_bdevs_operational": 4, 00:17:30.110 "base_bdevs_list": [ 00:17:30.110 { 00:17:30.111 "name": "BaseBdev1", 00:17:30.111 "uuid": "2d2faeea-2257-4491-a758-42696a09d8ce", 00:17:30.111 "is_configured": true, 00:17:30.111 "data_offset": 2048, 00:17:30.111 "data_size": 63488 00:17:30.111 }, 00:17:30.111 { 00:17:30.111 "name": "BaseBdev2", 00:17:30.111 "uuid": "631b6f48-7c3a-4b22-8e07-d29e85857eb4", 00:17:30.111 "is_configured": true, 00:17:30.111 "data_offset": 2048, 00:17:30.111 "data_size": 63488 00:17:30.111 }, 00:17:30.111 { 00:17:30.111 "name": "BaseBdev3", 00:17:30.111 "uuid": "54c351a0-a2cc-4542-93c3-a7bba44fb2bb", 00:17:30.111 "is_configured": true, 00:17:30.111 "data_offset": 2048, 00:17:30.111 "data_size": 63488 00:17:30.111 }, 00:17:30.111 { 00:17:30.111 "name": "BaseBdev4", 00:17:30.111 "uuid": "315b85aa-0fca-4a7d-9a21-9b42a898c2cb", 00:17:30.111 "is_configured": true, 00:17:30.111 "data_offset": 2048, 00:17:30.111 "data_size": 63488 00:17:30.111 } 00:17:30.111 ] 00:17:30.111 }' 00:17:30.111 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.111 11:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:30.679 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:30.679 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:30.679 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:30.679 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:30.679 11:58:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:30.679 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:30.679 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:30.679 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:30.679 [2024-07-12 11:58:20.800602] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:30.679 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:30.679 "name": "Existed_Raid", 00:17:30.679 "aliases": [ 00:17:30.679 "19ef10ed-f580-41fc-8db4-39b5ea8b915f" 00:17:30.679 ], 00:17:30.679 "product_name": "Raid Volume", 00:17:30.679 "block_size": 512, 00:17:30.679 "num_blocks": 63488, 00:17:30.679 "uuid": "19ef10ed-f580-41fc-8db4-39b5ea8b915f", 00:17:30.679 "assigned_rate_limits": { 00:17:30.679 "rw_ios_per_sec": 0, 00:17:30.679 "rw_mbytes_per_sec": 0, 00:17:30.679 "r_mbytes_per_sec": 0, 00:17:30.679 "w_mbytes_per_sec": 0 00:17:30.679 }, 00:17:30.679 "claimed": false, 00:17:30.679 "zoned": false, 00:17:30.679 "supported_io_types": { 00:17:30.679 "read": true, 00:17:30.679 "write": true, 00:17:30.679 "unmap": false, 00:17:30.679 "flush": false, 00:17:30.679 "reset": true, 00:17:30.679 "nvme_admin": false, 00:17:30.679 "nvme_io": false, 00:17:30.679 "nvme_io_md": false, 00:17:30.679 "write_zeroes": true, 00:17:30.679 "zcopy": false, 00:17:30.679 "get_zone_info": false, 00:17:30.679 "zone_management": false, 00:17:30.679 "zone_append": false, 00:17:30.679 "compare": false, 00:17:30.679 "compare_and_write": false, 00:17:30.679 "abort": false, 00:17:30.679 "seek_hole": false, 00:17:30.679 "seek_data": false, 00:17:30.679 "copy": false, 00:17:30.679 "nvme_iov_md": false 00:17:30.679 }, 00:17:30.679 
"memory_domains": [ 00:17:30.679 { 00:17:30.679 "dma_device_id": "system", 00:17:30.679 "dma_device_type": 1 00:17:30.679 }, 00:17:30.679 { 00:17:30.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.679 "dma_device_type": 2 00:17:30.679 }, 00:17:30.679 { 00:17:30.679 "dma_device_id": "system", 00:17:30.679 "dma_device_type": 1 00:17:30.679 }, 00:17:30.679 { 00:17:30.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.679 "dma_device_type": 2 00:17:30.679 }, 00:17:30.679 { 00:17:30.679 "dma_device_id": "system", 00:17:30.679 "dma_device_type": 1 00:17:30.679 }, 00:17:30.680 { 00:17:30.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.680 "dma_device_type": 2 00:17:30.680 }, 00:17:30.680 { 00:17:30.680 "dma_device_id": "system", 00:17:30.680 "dma_device_type": 1 00:17:30.680 }, 00:17:30.680 { 00:17:30.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.680 "dma_device_type": 2 00:17:30.680 } 00:17:30.680 ], 00:17:30.680 "driver_specific": { 00:17:30.680 "raid": { 00:17:30.680 "uuid": "19ef10ed-f580-41fc-8db4-39b5ea8b915f", 00:17:30.680 "strip_size_kb": 0, 00:17:30.680 "state": "online", 00:17:30.680 "raid_level": "raid1", 00:17:30.680 "superblock": true, 00:17:30.680 "num_base_bdevs": 4, 00:17:30.680 "num_base_bdevs_discovered": 4, 00:17:30.680 "num_base_bdevs_operational": 4, 00:17:30.680 "base_bdevs_list": [ 00:17:30.680 { 00:17:30.680 "name": "BaseBdev1", 00:17:30.680 "uuid": "2d2faeea-2257-4491-a758-42696a09d8ce", 00:17:30.680 "is_configured": true, 00:17:30.680 "data_offset": 2048, 00:17:30.680 "data_size": 63488 00:17:30.680 }, 00:17:30.680 { 00:17:30.680 "name": "BaseBdev2", 00:17:30.680 "uuid": "631b6f48-7c3a-4b22-8e07-d29e85857eb4", 00:17:30.680 "is_configured": true, 00:17:30.680 "data_offset": 2048, 00:17:30.680 "data_size": 63488 00:17:30.680 }, 00:17:30.680 { 00:17:30.680 "name": "BaseBdev3", 00:17:30.680 "uuid": "54c351a0-a2cc-4542-93c3-a7bba44fb2bb", 00:17:30.680 "is_configured": true, 00:17:30.680 "data_offset": 2048, 00:17:30.680 
"data_size": 63488 00:17:30.680 }, 00:17:30.680 { 00:17:30.680 "name": "BaseBdev4", 00:17:30.680 "uuid": "315b85aa-0fca-4a7d-9a21-9b42a898c2cb", 00:17:30.680 "is_configured": true, 00:17:30.680 "data_offset": 2048, 00:17:30.680 "data_size": 63488 00:17:30.680 } 00:17:30.680 ] 00:17:30.680 } 00:17:30.680 } 00:17:30.680 }' 00:17:30.680 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:30.680 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:30.680 BaseBdev2 00:17:30.680 BaseBdev3 00:17:30.680 BaseBdev4' 00:17:30.680 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:30.680 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:30.680 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:30.939 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:30.939 "name": "BaseBdev1", 00:17:30.939 "aliases": [ 00:17:30.939 "2d2faeea-2257-4491-a758-42696a09d8ce" 00:17:30.939 ], 00:17:30.939 "product_name": "Malloc disk", 00:17:30.939 "block_size": 512, 00:17:30.939 "num_blocks": 65536, 00:17:30.939 "uuid": "2d2faeea-2257-4491-a758-42696a09d8ce", 00:17:30.939 "assigned_rate_limits": { 00:17:30.939 "rw_ios_per_sec": 0, 00:17:30.939 "rw_mbytes_per_sec": 0, 00:17:30.939 "r_mbytes_per_sec": 0, 00:17:30.939 "w_mbytes_per_sec": 0 00:17:30.939 }, 00:17:30.939 "claimed": true, 00:17:30.939 "claim_type": "exclusive_write", 00:17:30.939 "zoned": false, 00:17:30.939 "supported_io_types": { 00:17:30.939 "read": true, 00:17:30.939 "write": true, 00:17:30.939 "unmap": true, 00:17:30.939 "flush": true, 00:17:30.939 "reset": true, 
00:17:30.939 "nvme_admin": false, 00:17:30.939 "nvme_io": false, 00:17:30.939 "nvme_io_md": false, 00:17:30.939 "write_zeroes": true, 00:17:30.939 "zcopy": true, 00:17:30.939 "get_zone_info": false, 00:17:30.939 "zone_management": false, 00:17:30.939 "zone_append": false, 00:17:30.939 "compare": false, 00:17:30.939 "compare_and_write": false, 00:17:30.939 "abort": true, 00:17:30.939 "seek_hole": false, 00:17:30.939 "seek_data": false, 00:17:30.939 "copy": true, 00:17:30.939 "nvme_iov_md": false 00:17:30.939 }, 00:17:30.939 "memory_domains": [ 00:17:30.939 { 00:17:30.939 "dma_device_id": "system", 00:17:30.939 "dma_device_type": 1 00:17:30.939 }, 00:17:30.939 { 00:17:30.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.939 "dma_device_type": 2 00:17:30.939 } 00:17:30.939 ], 00:17:30.939 "driver_specific": {} 00:17:30.939 }' 00:17:30.939 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.939 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.939 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:30.939 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.939 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.939 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.939 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.198 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.198 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.198 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.198 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:17:31.198 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.198 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:31.198 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.198 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.457 "name": "BaseBdev2", 00:17:31.457 "aliases": [ 00:17:31.457 "631b6f48-7c3a-4b22-8e07-d29e85857eb4" 00:17:31.457 ], 00:17:31.457 "product_name": "Malloc disk", 00:17:31.457 "block_size": 512, 00:17:31.457 "num_blocks": 65536, 00:17:31.457 "uuid": "631b6f48-7c3a-4b22-8e07-d29e85857eb4", 00:17:31.457 "assigned_rate_limits": { 00:17:31.457 "rw_ios_per_sec": 0, 00:17:31.457 "rw_mbytes_per_sec": 0, 00:17:31.457 "r_mbytes_per_sec": 0, 00:17:31.457 "w_mbytes_per_sec": 0 00:17:31.457 }, 00:17:31.457 "claimed": true, 00:17:31.457 "claim_type": "exclusive_write", 00:17:31.457 "zoned": false, 00:17:31.457 "supported_io_types": { 00:17:31.457 "read": true, 00:17:31.457 "write": true, 00:17:31.457 "unmap": true, 00:17:31.457 "flush": true, 00:17:31.457 "reset": true, 00:17:31.457 "nvme_admin": false, 00:17:31.457 "nvme_io": false, 00:17:31.457 "nvme_io_md": false, 00:17:31.457 "write_zeroes": true, 00:17:31.457 "zcopy": true, 00:17:31.457 "get_zone_info": false, 00:17:31.457 "zone_management": false, 00:17:31.457 "zone_append": false, 00:17:31.457 "compare": false, 00:17:31.457 "compare_and_write": false, 00:17:31.457 "abort": true, 00:17:31.457 "seek_hole": false, 00:17:31.457 "seek_data": false, 00:17:31.457 "copy": true, 00:17:31.457 "nvme_iov_md": false 00:17:31.457 }, 00:17:31.457 "memory_domains": [ 00:17:31.457 { 
00:17:31.457 "dma_device_id": "system", 00:17:31.457 "dma_device_type": 1 00:17:31.457 }, 00:17:31.457 { 00:17:31.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.457 "dma_device_type": 2 00:17:31.457 } 00:17:31.457 ], 00:17:31.457 "driver_specific": {} 00:17:31.457 }' 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.457 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.716 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.716 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.716 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:31.716 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:31.716 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.716 11:58:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.716 "name": "BaseBdev3", 00:17:31.716 "aliases": [ 00:17:31.716 "54c351a0-a2cc-4542-93c3-a7bba44fb2bb" 00:17:31.716 ], 00:17:31.716 "product_name": "Malloc disk", 00:17:31.716 "block_size": 512, 00:17:31.716 "num_blocks": 65536, 00:17:31.716 "uuid": "54c351a0-a2cc-4542-93c3-a7bba44fb2bb", 00:17:31.716 "assigned_rate_limits": { 00:17:31.716 "rw_ios_per_sec": 0, 00:17:31.716 "rw_mbytes_per_sec": 0, 00:17:31.716 "r_mbytes_per_sec": 0, 00:17:31.716 "w_mbytes_per_sec": 0 00:17:31.716 }, 00:17:31.716 "claimed": true, 00:17:31.717 "claim_type": "exclusive_write", 00:17:31.717 "zoned": false, 00:17:31.717 "supported_io_types": { 00:17:31.717 "read": true, 00:17:31.717 "write": true, 00:17:31.717 "unmap": true, 00:17:31.717 "flush": true, 00:17:31.717 "reset": true, 00:17:31.717 "nvme_admin": false, 00:17:31.717 "nvme_io": false, 00:17:31.717 "nvme_io_md": false, 00:17:31.717 "write_zeroes": true, 00:17:31.717 "zcopy": true, 00:17:31.717 "get_zone_info": false, 00:17:31.717 "zone_management": false, 00:17:31.717 "zone_append": false, 00:17:31.717 "compare": false, 00:17:31.717 "compare_and_write": false, 00:17:31.717 "abort": true, 00:17:31.717 "seek_hole": false, 00:17:31.717 "seek_data": false, 00:17:31.717 "copy": true, 00:17:31.717 "nvme_iov_md": false 00:17:31.717 }, 00:17:31.717 "memory_domains": [ 00:17:31.717 { 00:17:31.717 "dma_device_id": "system", 00:17:31.717 "dma_device_type": 1 00:17:31.717 }, 00:17:31.717 { 00:17:31.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.717 "dma_device_type": 2 00:17:31.717 } 00:17:31.717 ], 00:17:31.717 "driver_specific": {} 00:17:31.717 }' 00:17:31.717 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.717 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.976 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:17:31.976 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:31.976 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:32.237 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:32.237 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:32.237 "name": "BaseBdev4", 00:17:32.237 "aliases": [ 00:17:32.237 "315b85aa-0fca-4a7d-9a21-9b42a898c2cb" 00:17:32.237 ], 00:17:32.237 "product_name": "Malloc disk", 00:17:32.237 "block_size": 512, 00:17:32.237 "num_blocks": 65536, 00:17:32.237 "uuid": "315b85aa-0fca-4a7d-9a21-9b42a898c2cb", 00:17:32.237 "assigned_rate_limits": { 00:17:32.237 "rw_ios_per_sec": 0, 00:17:32.237 "rw_mbytes_per_sec": 0, 00:17:32.237 "r_mbytes_per_sec": 0, 00:17:32.237 "w_mbytes_per_sec": 0 
00:17:32.237 }, 00:17:32.237 "claimed": true, 00:17:32.237 "claim_type": "exclusive_write", 00:17:32.237 "zoned": false, 00:17:32.237 "supported_io_types": { 00:17:32.237 "read": true, 00:17:32.237 "write": true, 00:17:32.237 "unmap": true, 00:17:32.237 "flush": true, 00:17:32.237 "reset": true, 00:17:32.237 "nvme_admin": false, 00:17:32.237 "nvme_io": false, 00:17:32.237 "nvme_io_md": false, 00:17:32.237 "write_zeroes": true, 00:17:32.237 "zcopy": true, 00:17:32.237 "get_zone_info": false, 00:17:32.237 "zone_management": false, 00:17:32.237 "zone_append": false, 00:17:32.237 "compare": false, 00:17:32.237 "compare_and_write": false, 00:17:32.237 "abort": true, 00:17:32.237 "seek_hole": false, 00:17:32.237 "seek_data": false, 00:17:32.237 "copy": true, 00:17:32.237 "nvme_iov_md": false 00:17:32.237 }, 00:17:32.237 "memory_domains": [ 00:17:32.237 { 00:17:32.237 "dma_device_id": "system", 00:17:32.237 "dma_device_type": 1 00:17:32.237 }, 00:17:32.237 { 00:17:32.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.237 "dma_device_type": 2 00:17:32.237 } 00:17:32.237 ], 00:17:32.237 "driver_specific": {} 00:17:32.237 }' 00:17:32.237 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.237 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.237 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:32.237 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.497 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.497 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:32.497 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.497 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.497 
11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.497 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.497 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.497 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.497 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:32.757 [2024-07-12 11:58:22.845801] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.757 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.016 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.016 "name": "Existed_Raid", 00:17:33.016 "uuid": "19ef10ed-f580-41fc-8db4-39b5ea8b915f", 00:17:33.016 "strip_size_kb": 0, 00:17:33.016 "state": "online", 00:17:33.016 "raid_level": "raid1", 00:17:33.016 "superblock": true, 00:17:33.016 "num_base_bdevs": 4, 00:17:33.016 "num_base_bdevs_discovered": 3, 00:17:33.016 "num_base_bdevs_operational": 3, 00:17:33.016 "base_bdevs_list": [ 00:17:33.016 { 00:17:33.016 "name": null, 00:17:33.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.016 "is_configured": false, 00:17:33.016 "data_offset": 2048, 00:17:33.016 "data_size": 63488 00:17:33.016 }, 00:17:33.016 { 00:17:33.016 "name": "BaseBdev2", 00:17:33.016 "uuid": "631b6f48-7c3a-4b22-8e07-d29e85857eb4", 00:17:33.016 "is_configured": true, 00:17:33.016 "data_offset": 2048, 00:17:33.016 "data_size": 63488 00:17:33.016 }, 00:17:33.016 { 00:17:33.016 "name": "BaseBdev3", 00:17:33.016 "uuid": "54c351a0-a2cc-4542-93c3-a7bba44fb2bb", 00:17:33.016 "is_configured": true, 00:17:33.016 "data_offset": 2048, 00:17:33.016 "data_size": 63488 00:17:33.016 }, 00:17:33.016 { 00:17:33.016 "name": 
"BaseBdev4", 00:17:33.016 "uuid": "315b85aa-0fca-4a7d-9a21-9b42a898c2cb", 00:17:33.016 "is_configured": true, 00:17:33.016 "data_offset": 2048, 00:17:33.016 "data_size": 63488 00:17:33.016 } 00:17:33.016 ] 00:17:33.016 }' 00:17:33.016 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.016 11:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:33.275 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:33.275 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:33.275 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.275 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:33.533 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:33.533 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:33.533 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:33.792 [2024-07-12 11:58:23.837305] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:33.792 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:33.792 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:33.792 11:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.792 11:58:23 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:33.792 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:33.792 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:33.792 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:34.052 [2024-07-12 11:58:24.184096] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:34.052 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:34.052 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:34.052 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.052 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:34.311 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:34.311 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:34.311 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:34.311 [2024-07-12 11:58:24.534793] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:34.311 [2024-07-12 11:58:24.534852] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:34.311 [2024-07-12 11:58:24.544880] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:34.311 [2024-07-12 11:58:24.544921] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:34.311 [2024-07-12 11:58:24.544926] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22a8b90 name Existed_Raid, state offline 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:34.571 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:34.831 BaseBdev2 00:17:34.831 11:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:34.831 11:58:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:34.831 11:58:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:34.831 11:58:24 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:17:34.831 11:58:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:34.831 11:58:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:34.831 11:58:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:34.831 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:35.089 [ 00:17:35.089 { 00:17:35.089 "name": "BaseBdev2", 00:17:35.089 "aliases": [ 00:17:35.089 "b2fbf4d6-af3b-425c-9cd2-3883c17b6824" 00:17:35.089 ], 00:17:35.089 "product_name": "Malloc disk", 00:17:35.089 "block_size": 512, 00:17:35.089 "num_blocks": 65536, 00:17:35.089 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:35.089 "assigned_rate_limits": { 00:17:35.089 "rw_ios_per_sec": 0, 00:17:35.089 "rw_mbytes_per_sec": 0, 00:17:35.089 "r_mbytes_per_sec": 0, 00:17:35.089 "w_mbytes_per_sec": 0 00:17:35.089 }, 00:17:35.089 "claimed": false, 00:17:35.089 "zoned": false, 00:17:35.089 "supported_io_types": { 00:17:35.089 "read": true, 00:17:35.089 "write": true, 00:17:35.089 "unmap": true, 00:17:35.089 "flush": true, 00:17:35.089 "reset": true, 00:17:35.089 "nvme_admin": false, 00:17:35.089 "nvme_io": false, 00:17:35.089 "nvme_io_md": false, 00:17:35.089 "write_zeroes": true, 00:17:35.089 "zcopy": true, 00:17:35.089 "get_zone_info": false, 00:17:35.089 "zone_management": false, 00:17:35.089 "zone_append": false, 00:17:35.089 "compare": false, 00:17:35.089 "compare_and_write": false, 00:17:35.089 "abort": true, 00:17:35.089 "seek_hole": false, 00:17:35.089 "seek_data": false, 00:17:35.089 "copy": true, 00:17:35.089 "nvme_iov_md": false 00:17:35.089 }, 00:17:35.089 
"memory_domains": [ 00:17:35.089 { 00:17:35.089 "dma_device_id": "system", 00:17:35.089 "dma_device_type": 1 00:17:35.089 }, 00:17:35.089 { 00:17:35.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.089 "dma_device_type": 2 00:17:35.089 } 00:17:35.089 ], 00:17:35.089 "driver_specific": {} 00:17:35.089 } 00:17:35.089 ] 00:17:35.089 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:35.089 11:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:35.089 11:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:35.089 11:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:35.348 BaseBdev3 00:17:35.348 11:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:35.348 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:35.348 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:35.348 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:35.348 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:35.348 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:35.348 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:35.348 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:17:35.607 [ 00:17:35.607 { 00:17:35.607 "name": "BaseBdev3", 00:17:35.607 "aliases": [ 00:17:35.607 "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce" 00:17:35.607 ], 00:17:35.607 "product_name": "Malloc disk", 00:17:35.607 "block_size": 512, 00:17:35.607 "num_blocks": 65536, 00:17:35.607 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:35.607 "assigned_rate_limits": { 00:17:35.607 "rw_ios_per_sec": 0, 00:17:35.607 "rw_mbytes_per_sec": 0, 00:17:35.607 "r_mbytes_per_sec": 0, 00:17:35.607 "w_mbytes_per_sec": 0 00:17:35.607 }, 00:17:35.607 "claimed": false, 00:17:35.607 "zoned": false, 00:17:35.607 "supported_io_types": { 00:17:35.607 "read": true, 00:17:35.607 "write": true, 00:17:35.607 "unmap": true, 00:17:35.607 "flush": true, 00:17:35.607 "reset": true, 00:17:35.607 "nvme_admin": false, 00:17:35.607 "nvme_io": false, 00:17:35.607 "nvme_io_md": false, 00:17:35.607 "write_zeroes": true, 00:17:35.607 "zcopy": true, 00:17:35.607 "get_zone_info": false, 00:17:35.607 "zone_management": false, 00:17:35.607 "zone_append": false, 00:17:35.607 "compare": false, 00:17:35.607 "compare_and_write": false, 00:17:35.607 "abort": true, 00:17:35.607 "seek_hole": false, 00:17:35.607 "seek_data": false, 00:17:35.607 "copy": true, 00:17:35.607 "nvme_iov_md": false 00:17:35.607 }, 00:17:35.607 "memory_domains": [ 00:17:35.607 { 00:17:35.607 "dma_device_id": "system", 00:17:35.607 "dma_device_type": 1 00:17:35.607 }, 00:17:35.607 { 00:17:35.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.607 "dma_device_type": 2 00:17:35.607 } 00:17:35.607 ], 00:17:35.607 "driver_specific": {} 00:17:35.607 } 00:17:35.607 ] 00:17:35.607 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:35.607 11:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:35.607 11:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:35.607 11:58:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:35.867 BaseBdev4 00:17:35.867 11:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:35.867 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:35.867 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:35.867 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:35.867 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:35.867 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:35.867 11:58:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:35.867 11:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:36.126 [ 00:17:36.126 { 00:17:36.126 "name": "BaseBdev4", 00:17:36.126 "aliases": [ 00:17:36.126 "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c" 00:17:36.126 ], 00:17:36.126 "product_name": "Malloc disk", 00:17:36.126 "block_size": 512, 00:17:36.126 "num_blocks": 65536, 00:17:36.126 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:36.126 "assigned_rate_limits": { 00:17:36.126 "rw_ios_per_sec": 0, 00:17:36.126 "rw_mbytes_per_sec": 0, 00:17:36.126 "r_mbytes_per_sec": 0, 00:17:36.126 "w_mbytes_per_sec": 0 00:17:36.126 }, 00:17:36.126 "claimed": false, 00:17:36.126 "zoned": false, 00:17:36.126 "supported_io_types": { 00:17:36.126 "read": true, 
00:17:36.126 "write": true, 00:17:36.126 "unmap": true, 00:17:36.126 "flush": true, 00:17:36.126 "reset": true, 00:17:36.126 "nvme_admin": false, 00:17:36.126 "nvme_io": false, 00:17:36.126 "nvme_io_md": false, 00:17:36.126 "write_zeroes": true, 00:17:36.126 "zcopy": true, 00:17:36.126 "get_zone_info": false, 00:17:36.126 "zone_management": false, 00:17:36.126 "zone_append": false, 00:17:36.126 "compare": false, 00:17:36.126 "compare_and_write": false, 00:17:36.126 "abort": true, 00:17:36.126 "seek_hole": false, 00:17:36.126 "seek_data": false, 00:17:36.126 "copy": true, 00:17:36.126 "nvme_iov_md": false 00:17:36.126 }, 00:17:36.126 "memory_domains": [ 00:17:36.126 { 00:17:36.126 "dma_device_id": "system", 00:17:36.126 "dma_device_type": 1 00:17:36.126 }, 00:17:36.126 { 00:17:36.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.126 "dma_device_type": 2 00:17:36.126 } 00:17:36.126 ], 00:17:36.126 "driver_specific": {} 00:17:36.126 } 00:17:36.126 ] 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:36.126 [2024-07-12 11:58:26.336463] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:36.126 [2024-07-12 11:58:26.336491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:36.126 [2024-07-12 11:58:26.336502] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:36.126 [2024-07-12 11:58:26.337447] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:36.126 [2024-07-12 11:58:26.337474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.126 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.385 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.385 "name": "Existed_Raid", 00:17:36.385 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:36.385 "strip_size_kb": 0, 00:17:36.385 "state": 
"configuring", 00:17:36.385 "raid_level": "raid1", 00:17:36.385 "superblock": true, 00:17:36.385 "num_base_bdevs": 4, 00:17:36.385 "num_base_bdevs_discovered": 3, 00:17:36.385 "num_base_bdevs_operational": 4, 00:17:36.385 "base_bdevs_list": [ 00:17:36.385 { 00:17:36.385 "name": "BaseBdev1", 00:17:36.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.385 "is_configured": false, 00:17:36.385 "data_offset": 0, 00:17:36.385 "data_size": 0 00:17:36.385 }, 00:17:36.385 { 00:17:36.385 "name": "BaseBdev2", 00:17:36.385 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:36.385 "is_configured": true, 00:17:36.385 "data_offset": 2048, 00:17:36.385 "data_size": 63488 00:17:36.385 }, 00:17:36.385 { 00:17:36.385 "name": "BaseBdev3", 00:17:36.385 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:36.385 "is_configured": true, 00:17:36.385 "data_offset": 2048, 00:17:36.385 "data_size": 63488 00:17:36.385 }, 00:17:36.385 { 00:17:36.385 "name": "BaseBdev4", 00:17:36.385 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:36.385 "is_configured": true, 00:17:36.385 "data_offset": 2048, 00:17:36.385 "data_size": 63488 00:17:36.385 } 00:17:36.385 ] 00:17:36.385 }' 00:17:36.385 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.385 11:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:36.953 11:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:36.953 [2024-07-12 11:58:27.138536] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.953 
11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.953 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.212 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.212 "name": "Existed_Raid", 00:17:37.212 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:37.212 "strip_size_kb": 0, 00:17:37.212 "state": "configuring", 00:17:37.212 "raid_level": "raid1", 00:17:37.212 "superblock": true, 00:17:37.212 "num_base_bdevs": 4, 00:17:37.212 "num_base_bdevs_discovered": 2, 00:17:37.212 "num_base_bdevs_operational": 4, 00:17:37.212 "base_bdevs_list": [ 00:17:37.212 { 00:17:37.212 "name": "BaseBdev1", 00:17:37.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.212 "is_configured": false, 00:17:37.212 "data_offset": 0, 00:17:37.212 "data_size": 0 00:17:37.212 }, 00:17:37.212 { 00:17:37.212 
"name": null, 00:17:37.212 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:37.212 "is_configured": false, 00:17:37.212 "data_offset": 2048, 00:17:37.212 "data_size": 63488 00:17:37.212 }, 00:17:37.212 { 00:17:37.212 "name": "BaseBdev3", 00:17:37.212 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:37.212 "is_configured": true, 00:17:37.212 "data_offset": 2048, 00:17:37.212 "data_size": 63488 00:17:37.212 }, 00:17:37.212 { 00:17:37.212 "name": "BaseBdev4", 00:17:37.212 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:37.212 "is_configured": true, 00:17:37.212 "data_offset": 2048, 00:17:37.212 "data_size": 63488 00:17:37.212 } 00:17:37.212 ] 00:17:37.212 }' 00:17:37.212 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.212 11:58:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:37.782 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:37.782 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.782 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:37.782 11:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:38.040 [2024-07-12 11:58:28.128161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:38.040 BaseBdev1 00:17:38.040 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:38.040 11:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:38.040 11:58:28 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:38.040 11:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:38.040 11:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:38.040 11:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:38.040 11:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:38.299 [ 00:17:38.299 { 00:17:38.299 "name": "BaseBdev1", 00:17:38.299 "aliases": [ 00:17:38.299 "97c9a7e6-56f0-4641-838a-30e6a4394a1e" 00:17:38.299 ], 00:17:38.299 "product_name": "Malloc disk", 00:17:38.299 "block_size": 512, 00:17:38.299 "num_blocks": 65536, 00:17:38.299 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:38.299 "assigned_rate_limits": { 00:17:38.299 "rw_ios_per_sec": 0, 00:17:38.299 "rw_mbytes_per_sec": 0, 00:17:38.299 "r_mbytes_per_sec": 0, 00:17:38.299 "w_mbytes_per_sec": 0 00:17:38.299 }, 00:17:38.299 "claimed": true, 00:17:38.299 "claim_type": "exclusive_write", 00:17:38.299 "zoned": false, 00:17:38.299 "supported_io_types": { 00:17:38.299 "read": true, 00:17:38.299 "write": true, 00:17:38.299 "unmap": true, 00:17:38.299 "flush": true, 00:17:38.299 "reset": true, 00:17:38.299 "nvme_admin": false, 00:17:38.299 "nvme_io": false, 00:17:38.299 "nvme_io_md": false, 00:17:38.299 "write_zeroes": true, 00:17:38.299 "zcopy": true, 00:17:38.299 "get_zone_info": false, 00:17:38.299 "zone_management": false, 00:17:38.299 "zone_append": false, 00:17:38.299 "compare": false, 00:17:38.299 
"compare_and_write": false, 00:17:38.299 "abort": true, 00:17:38.299 "seek_hole": false, 00:17:38.299 "seek_data": false, 00:17:38.299 "copy": true, 00:17:38.299 "nvme_iov_md": false 00:17:38.299 }, 00:17:38.299 "memory_domains": [ 00:17:38.299 { 00:17:38.299 "dma_device_id": "system", 00:17:38.299 "dma_device_type": 1 00:17:38.299 }, 00:17:38.299 { 00:17:38.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.299 "dma_device_type": 2 00:17:38.299 } 00:17:38.299 ], 00:17:38.299 "driver_specific": {} 00:17:38.299 } 00:17:38.299 ] 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.299 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.558 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.558 "name": "Existed_Raid", 00:17:38.558 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:38.558 "strip_size_kb": 0, 00:17:38.558 "state": "configuring", 00:17:38.558 "raid_level": "raid1", 00:17:38.558 "superblock": true, 00:17:38.558 "num_base_bdevs": 4, 00:17:38.558 "num_base_bdevs_discovered": 3, 00:17:38.558 "num_base_bdevs_operational": 4, 00:17:38.558 "base_bdevs_list": [ 00:17:38.558 { 00:17:38.558 "name": "BaseBdev1", 00:17:38.558 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:38.558 "is_configured": true, 00:17:38.558 "data_offset": 2048, 00:17:38.558 "data_size": 63488 00:17:38.558 }, 00:17:38.558 { 00:17:38.558 "name": null, 00:17:38.558 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:38.558 "is_configured": false, 00:17:38.558 "data_offset": 2048, 00:17:38.558 "data_size": 63488 00:17:38.558 }, 00:17:38.558 { 00:17:38.558 "name": "BaseBdev3", 00:17:38.558 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:38.558 "is_configured": true, 00:17:38.558 "data_offset": 2048, 00:17:38.558 "data_size": 63488 00:17:38.558 }, 00:17:38.558 { 00:17:38.558 "name": "BaseBdev4", 00:17:38.558 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:38.558 "is_configured": true, 00:17:38.558 "data_offset": 2048, 00:17:38.558 "data_size": 63488 00:17:38.558 } 00:17:38.558 ] 00:17:38.558 }' 00:17:38.558 11:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.558 11:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:39.126 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.126 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:39.126 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:39.126 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:39.384 [2024-07-12 11:58:29.423524] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.384 "name": "Existed_Raid", 00:17:39.384 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:39.384 "strip_size_kb": 0, 00:17:39.384 "state": "configuring", 00:17:39.384 "raid_level": "raid1", 00:17:39.384 "superblock": true, 00:17:39.384 "num_base_bdevs": 4, 00:17:39.384 "num_base_bdevs_discovered": 2, 00:17:39.384 "num_base_bdevs_operational": 4, 00:17:39.384 "base_bdevs_list": [ 00:17:39.384 { 00:17:39.384 "name": "BaseBdev1", 00:17:39.384 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:39.384 "is_configured": true, 00:17:39.384 "data_offset": 2048, 00:17:39.384 "data_size": 63488 00:17:39.384 }, 00:17:39.384 { 00:17:39.384 "name": null, 00:17:39.384 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:39.384 "is_configured": false, 00:17:39.384 "data_offset": 2048, 00:17:39.384 "data_size": 63488 00:17:39.384 }, 00:17:39.384 { 00:17:39.384 "name": null, 00:17:39.384 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:39.384 "is_configured": false, 00:17:39.384 "data_offset": 2048, 00:17:39.384 "data_size": 63488 00:17:39.384 }, 00:17:39.384 { 00:17:39.384 "name": "BaseBdev4", 00:17:39.384 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:39.384 "is_configured": true, 00:17:39.384 "data_offset": 2048, 00:17:39.384 "data_size": 63488 00:17:39.384 } 00:17:39.384 ] 00:17:39.384 }' 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.384 11:58:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:39.951 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:39.951 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:40.209 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:40.209 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:40.209 [2024-07-12 11:58:30.450198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.468 "name": "Existed_Raid", 00:17:40.468 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:40.468 "strip_size_kb": 0, 00:17:40.468 "state": "configuring", 00:17:40.468 "raid_level": "raid1", 00:17:40.468 "superblock": true, 00:17:40.468 "num_base_bdevs": 4, 00:17:40.468 "num_base_bdevs_discovered": 3, 00:17:40.468 "num_base_bdevs_operational": 4, 00:17:40.468 "base_bdevs_list": [ 00:17:40.468 { 00:17:40.468 "name": "BaseBdev1", 00:17:40.468 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:40.468 "is_configured": true, 00:17:40.468 "data_offset": 2048, 00:17:40.468 "data_size": 63488 00:17:40.468 }, 00:17:40.468 { 00:17:40.468 "name": null, 00:17:40.468 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:40.468 "is_configured": false, 00:17:40.468 "data_offset": 2048, 00:17:40.468 "data_size": 63488 00:17:40.468 }, 00:17:40.468 { 00:17:40.468 "name": "BaseBdev3", 00:17:40.468 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:40.468 "is_configured": true, 00:17:40.468 "data_offset": 2048, 00:17:40.468 "data_size": 63488 00:17:40.468 }, 00:17:40.468 { 00:17:40.468 "name": "BaseBdev4", 00:17:40.468 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:40.468 "is_configured": true, 00:17:40.468 "data_offset": 2048, 00:17:40.468 "data_size": 63488 00:17:40.468 } 00:17:40.468 ] 00:17:40.468 }' 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.468 11:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:41.035 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:41.035 11:58:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:41.294 [2024-07-12 11:58:31.436761] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.294 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.295 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.295 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:17:41.295 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.554 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.554 "name": "Existed_Raid", 00:17:41.554 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:41.554 "strip_size_kb": 0, 00:17:41.554 "state": "configuring", 00:17:41.554 "raid_level": "raid1", 00:17:41.554 "superblock": true, 00:17:41.554 "num_base_bdevs": 4, 00:17:41.554 "num_base_bdevs_discovered": 2, 00:17:41.554 "num_base_bdevs_operational": 4, 00:17:41.554 "base_bdevs_list": [ 00:17:41.554 { 00:17:41.554 "name": null, 00:17:41.554 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:41.554 "is_configured": false, 00:17:41.554 "data_offset": 2048, 00:17:41.554 "data_size": 63488 00:17:41.554 }, 00:17:41.554 { 00:17:41.554 "name": null, 00:17:41.554 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:41.554 "is_configured": false, 00:17:41.554 "data_offset": 2048, 00:17:41.554 "data_size": 63488 00:17:41.554 }, 00:17:41.554 { 00:17:41.554 "name": "BaseBdev3", 00:17:41.554 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:41.554 "is_configured": true, 00:17:41.554 "data_offset": 2048, 00:17:41.554 "data_size": 63488 00:17:41.554 }, 00:17:41.554 { 00:17:41.554 "name": "BaseBdev4", 00:17:41.554 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:41.554 "is_configured": true, 00:17:41.554 "data_offset": 2048, 00:17:41.554 "data_size": 63488 00:17:41.554 } 00:17:41.554 ] 00:17:41.554 }' 00:17:41.554 11:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.554 11:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:42.122 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.122 
11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:42.122 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:42.122 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:42.382 [2024-07-12 11:58:32.457299] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.382 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.382 11:58:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.640 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.640 "name": "Existed_Raid", 00:17:42.640 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:42.640 "strip_size_kb": 0, 00:17:42.640 "state": "configuring", 00:17:42.640 "raid_level": "raid1", 00:17:42.640 "superblock": true, 00:17:42.640 "num_base_bdevs": 4, 00:17:42.640 "num_base_bdevs_discovered": 3, 00:17:42.640 "num_base_bdevs_operational": 4, 00:17:42.640 "base_bdevs_list": [ 00:17:42.640 { 00:17:42.640 "name": null, 00:17:42.640 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:42.640 "is_configured": false, 00:17:42.640 "data_offset": 2048, 00:17:42.640 "data_size": 63488 00:17:42.640 }, 00:17:42.640 { 00:17:42.640 "name": "BaseBdev2", 00:17:42.640 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:42.640 "is_configured": true, 00:17:42.640 "data_offset": 2048, 00:17:42.640 "data_size": 63488 00:17:42.640 }, 00:17:42.640 { 00:17:42.640 "name": "BaseBdev3", 00:17:42.640 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:42.640 "is_configured": true, 00:17:42.640 "data_offset": 2048, 00:17:42.640 "data_size": 63488 00:17:42.640 }, 00:17:42.640 { 00:17:42.640 "name": "BaseBdev4", 00:17:42.640 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:42.640 "is_configured": true, 00:17:42.640 "data_offset": 2048, 00:17:42.640 "data_size": 63488 00:17:42.640 } 00:17:42.640 ] 00:17:42.640 }' 00:17:42.640 11:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.640 11:58:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:42.899 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.899 11:58:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:43.156 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:43.156 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.156 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:43.427 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 97c9a7e6-56f0-4641-838a-30e6a4394a1e 00:17:43.427 [2024-07-12 11:58:33.615039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:43.427 [2024-07-12 11:58:33.615152] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x244c100 00:17:43.427 [2024-07-12 11:58:33.615160] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:43.427 [2024-07-12 11:58:33.615272] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x244c8c0 00:17:43.427 [2024-07-12 11:58:33.615352] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x244c100 00:17:43.427 [2024-07-12 11:58:33.615357] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x244c100 00:17:43.427 [2024-07-12 11:58:33.615414] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:43.427 NewBaseBdev 00:17:43.427 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:43.427 11:58:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:43.427 11:58:33 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:43.427 11:58:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:43.427 11:58:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:43.427 11:58:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:43.427 11:58:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:43.711 11:58:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:43.711 [ 00:17:43.711 { 00:17:43.711 "name": "NewBaseBdev", 00:17:43.711 "aliases": [ 00:17:43.711 "97c9a7e6-56f0-4641-838a-30e6a4394a1e" 00:17:43.711 ], 00:17:43.711 "product_name": "Malloc disk", 00:17:43.711 "block_size": 512, 00:17:43.711 "num_blocks": 65536, 00:17:43.711 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:43.711 "assigned_rate_limits": { 00:17:43.711 "rw_ios_per_sec": 0, 00:17:43.711 "rw_mbytes_per_sec": 0, 00:17:43.711 "r_mbytes_per_sec": 0, 00:17:43.711 "w_mbytes_per_sec": 0 00:17:43.711 }, 00:17:43.711 "claimed": true, 00:17:43.711 "claim_type": "exclusive_write", 00:17:43.711 "zoned": false, 00:17:43.711 "supported_io_types": { 00:17:43.711 "read": true, 00:17:43.711 "write": true, 00:17:43.711 "unmap": true, 00:17:43.711 "flush": true, 00:17:43.711 "reset": true, 00:17:43.711 "nvme_admin": false, 00:17:43.711 "nvme_io": false, 00:17:43.711 "nvme_io_md": false, 00:17:43.711 "write_zeroes": true, 00:17:43.711 "zcopy": true, 00:17:43.711 "get_zone_info": false, 00:17:43.711 "zone_management": false, 00:17:43.711 "zone_append": false, 00:17:43.711 "compare": false, 00:17:43.711 
"compare_and_write": false, 00:17:43.711 "abort": true, 00:17:43.711 "seek_hole": false, 00:17:43.711 "seek_data": false, 00:17:43.711 "copy": true, 00:17:43.712 "nvme_iov_md": false 00:17:43.712 }, 00:17:43.712 "memory_domains": [ 00:17:43.712 { 00:17:43.712 "dma_device_id": "system", 00:17:43.712 "dma_device_type": 1 00:17:43.712 }, 00:17:43.712 { 00:17:43.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.712 "dma_device_type": 2 00:17:43.712 } 00:17:43.712 ], 00:17:43.712 "driver_specific": {} 00:17:43.712 } 00:17:43.712 ] 00:17:43.712 11:58:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.004 11:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.004 11:58:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.004 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.004 "name": "Existed_Raid", 00:17:44.004 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:44.004 "strip_size_kb": 0, 00:17:44.004 "state": "online", 00:17:44.004 "raid_level": "raid1", 00:17:44.004 "superblock": true, 00:17:44.004 "num_base_bdevs": 4, 00:17:44.004 "num_base_bdevs_discovered": 4, 00:17:44.004 "num_base_bdevs_operational": 4, 00:17:44.004 "base_bdevs_list": [ 00:17:44.004 { 00:17:44.004 "name": "NewBaseBdev", 00:17:44.004 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:44.004 "is_configured": true, 00:17:44.004 "data_offset": 2048, 00:17:44.004 "data_size": 63488 00:17:44.004 }, 00:17:44.004 { 00:17:44.004 "name": "BaseBdev2", 00:17:44.004 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:44.004 "is_configured": true, 00:17:44.004 "data_offset": 2048, 00:17:44.004 "data_size": 63488 00:17:44.004 }, 00:17:44.004 { 00:17:44.004 "name": "BaseBdev3", 00:17:44.004 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:44.004 "is_configured": true, 00:17:44.004 "data_offset": 2048, 00:17:44.004 "data_size": 63488 00:17:44.004 }, 00:17:44.004 { 00:17:44.004 "name": "BaseBdev4", 00:17:44.004 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:44.004 "is_configured": true, 00:17:44.004 "data_offset": 2048, 00:17:44.004 "data_size": 63488 00:17:44.004 } 00:17:44.004 ] 00:17:44.004 }' 00:17:44.004 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.004 11:58:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:44.571 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:44.571 11:58:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:44.571 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:44.571 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:44.571 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:44.571 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:44.571 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:44.571 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:44.571 [2024-07-12 11:58:34.774258] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:44.571 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:44.571 "name": "Existed_Raid", 00:17:44.571 "aliases": [ 00:17:44.571 "85f85ca2-6c2b-4713-bc55-62973c5cd7d0" 00:17:44.571 ], 00:17:44.571 "product_name": "Raid Volume", 00:17:44.571 "block_size": 512, 00:17:44.571 "num_blocks": 63488, 00:17:44.571 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:44.571 "assigned_rate_limits": { 00:17:44.571 "rw_ios_per_sec": 0, 00:17:44.571 "rw_mbytes_per_sec": 0, 00:17:44.571 "r_mbytes_per_sec": 0, 00:17:44.571 "w_mbytes_per_sec": 0 00:17:44.571 }, 00:17:44.571 "claimed": false, 00:17:44.571 "zoned": false, 00:17:44.571 "supported_io_types": { 00:17:44.571 "read": true, 00:17:44.571 "write": true, 00:17:44.571 "unmap": false, 00:17:44.571 "flush": false, 00:17:44.571 "reset": true, 00:17:44.571 "nvme_admin": false, 00:17:44.571 "nvme_io": false, 00:17:44.571 "nvme_io_md": false, 00:17:44.571 "write_zeroes": true, 00:17:44.571 "zcopy": false, 00:17:44.571 
"get_zone_info": false, 00:17:44.571 "zone_management": false, 00:17:44.571 "zone_append": false, 00:17:44.571 "compare": false, 00:17:44.571 "compare_and_write": false, 00:17:44.571 "abort": false, 00:17:44.571 "seek_hole": false, 00:17:44.571 "seek_data": false, 00:17:44.571 "copy": false, 00:17:44.571 "nvme_iov_md": false 00:17:44.571 }, 00:17:44.571 "memory_domains": [ 00:17:44.571 { 00:17:44.571 "dma_device_id": "system", 00:17:44.571 "dma_device_type": 1 00:17:44.571 }, 00:17:44.571 { 00:17:44.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.571 "dma_device_type": 2 00:17:44.571 }, 00:17:44.571 { 00:17:44.571 "dma_device_id": "system", 00:17:44.571 "dma_device_type": 1 00:17:44.571 }, 00:17:44.571 { 00:17:44.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.571 "dma_device_type": 2 00:17:44.571 }, 00:17:44.571 { 00:17:44.571 "dma_device_id": "system", 00:17:44.571 "dma_device_type": 1 00:17:44.571 }, 00:17:44.571 { 00:17:44.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.571 "dma_device_type": 2 00:17:44.571 }, 00:17:44.571 { 00:17:44.571 "dma_device_id": "system", 00:17:44.571 "dma_device_type": 1 00:17:44.571 }, 00:17:44.571 { 00:17:44.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.571 "dma_device_type": 2 00:17:44.571 } 00:17:44.571 ], 00:17:44.571 "driver_specific": { 00:17:44.571 "raid": { 00:17:44.571 "uuid": "85f85ca2-6c2b-4713-bc55-62973c5cd7d0", 00:17:44.571 "strip_size_kb": 0, 00:17:44.571 "state": "online", 00:17:44.571 "raid_level": "raid1", 00:17:44.571 "superblock": true, 00:17:44.571 "num_base_bdevs": 4, 00:17:44.571 "num_base_bdevs_discovered": 4, 00:17:44.571 "num_base_bdevs_operational": 4, 00:17:44.571 "base_bdevs_list": [ 00:17:44.571 { 00:17:44.571 "name": "NewBaseBdev", 00:17:44.571 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:44.571 "is_configured": true, 00:17:44.571 "data_offset": 2048, 00:17:44.572 "data_size": 63488 00:17:44.572 }, 00:17:44.572 { 00:17:44.572 "name": "BaseBdev2", 00:17:44.572 
"uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:44.572 "is_configured": true, 00:17:44.572 "data_offset": 2048, 00:17:44.572 "data_size": 63488 00:17:44.572 }, 00:17:44.572 { 00:17:44.572 "name": "BaseBdev3", 00:17:44.572 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:44.572 "is_configured": true, 00:17:44.572 "data_offset": 2048, 00:17:44.572 "data_size": 63488 00:17:44.572 }, 00:17:44.572 { 00:17:44.572 "name": "BaseBdev4", 00:17:44.572 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:44.572 "is_configured": true, 00:17:44.572 "data_offset": 2048, 00:17:44.572 "data_size": 63488 00:17:44.572 } 00:17:44.572 ] 00:17:44.572 } 00:17:44.572 } 00:17:44.572 }' 00:17:44.572 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:44.831 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:44.831 BaseBdev2 00:17:44.831 BaseBdev3 00:17:44.831 BaseBdev4' 00:17:44.831 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:44.831 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:44.831 11:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:44.831 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:44.831 "name": "NewBaseBdev", 00:17:44.831 "aliases": [ 00:17:44.831 "97c9a7e6-56f0-4641-838a-30e6a4394a1e" 00:17:44.831 ], 00:17:44.831 "product_name": "Malloc disk", 00:17:44.831 "block_size": 512, 00:17:44.831 "num_blocks": 65536, 00:17:44.831 "uuid": "97c9a7e6-56f0-4641-838a-30e6a4394a1e", 00:17:44.831 "assigned_rate_limits": { 00:17:44.831 "rw_ios_per_sec": 0, 00:17:44.831 "rw_mbytes_per_sec": 0, 
00:17:44.831 "r_mbytes_per_sec": 0, 00:17:44.831 "w_mbytes_per_sec": 0 00:17:44.831 }, 00:17:44.831 "claimed": true, 00:17:44.831 "claim_type": "exclusive_write", 00:17:44.831 "zoned": false, 00:17:44.831 "supported_io_types": { 00:17:44.831 "read": true, 00:17:44.831 "write": true, 00:17:44.831 "unmap": true, 00:17:44.831 "flush": true, 00:17:44.831 "reset": true, 00:17:44.831 "nvme_admin": false, 00:17:44.831 "nvme_io": false, 00:17:44.831 "nvme_io_md": false, 00:17:44.831 "write_zeroes": true, 00:17:44.831 "zcopy": true, 00:17:44.831 "get_zone_info": false, 00:17:44.831 "zone_management": false, 00:17:44.831 "zone_append": false, 00:17:44.831 "compare": false, 00:17:44.831 "compare_and_write": false, 00:17:44.831 "abort": true, 00:17:44.831 "seek_hole": false, 00:17:44.831 "seek_data": false, 00:17:44.831 "copy": true, 00:17:44.831 "nvme_iov_md": false 00:17:44.831 }, 00:17:44.831 "memory_domains": [ 00:17:44.831 { 00:17:44.831 "dma_device_id": "system", 00:17:44.831 "dma_device_type": 1 00:17:44.831 }, 00:17:44.831 { 00:17:44.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.831 "dma_device_type": 2 00:17:44.831 } 00:17:44.831 ], 00:17:44.831 "driver_specific": {} 00:17:44.831 }' 00:17:44.831 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.831 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.090 11:58:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:45.090 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.349 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.349 "name": "BaseBdev2", 00:17:45.349 "aliases": [ 00:17:45.349 "b2fbf4d6-af3b-425c-9cd2-3883c17b6824" 00:17:45.349 ], 00:17:45.349 "product_name": "Malloc disk", 00:17:45.349 "block_size": 512, 00:17:45.349 "num_blocks": 65536, 00:17:45.349 "uuid": "b2fbf4d6-af3b-425c-9cd2-3883c17b6824", 00:17:45.349 "assigned_rate_limits": { 00:17:45.349 "rw_ios_per_sec": 0, 00:17:45.349 "rw_mbytes_per_sec": 0, 00:17:45.349 "r_mbytes_per_sec": 0, 00:17:45.349 "w_mbytes_per_sec": 0 00:17:45.349 }, 00:17:45.349 "claimed": true, 00:17:45.349 "claim_type": "exclusive_write", 00:17:45.349 "zoned": false, 00:17:45.349 "supported_io_types": { 00:17:45.349 "read": true, 00:17:45.349 "write": true, 00:17:45.349 "unmap": true, 00:17:45.349 "flush": true, 00:17:45.349 "reset": true, 00:17:45.349 "nvme_admin": false, 00:17:45.349 "nvme_io": false, 00:17:45.349 "nvme_io_md": false, 00:17:45.349 "write_zeroes": true, 00:17:45.349 "zcopy": true, 00:17:45.349 
"get_zone_info": false, 00:17:45.349 "zone_management": false, 00:17:45.349 "zone_append": false, 00:17:45.349 "compare": false, 00:17:45.349 "compare_and_write": false, 00:17:45.349 "abort": true, 00:17:45.349 "seek_hole": false, 00:17:45.349 "seek_data": false, 00:17:45.349 "copy": true, 00:17:45.349 "nvme_iov_md": false 00:17:45.349 }, 00:17:45.349 "memory_domains": [ 00:17:45.349 { 00:17:45.349 "dma_device_id": "system", 00:17:45.349 "dma_device_type": 1 00:17:45.349 }, 00:17:45.349 { 00:17:45.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.349 "dma_device_type": 2 00:17:45.349 } 00:17:45.349 ], 00:17:45.349 "driver_specific": {} 00:17:45.349 }' 00:17:45.349 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.349 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.349 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.349 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.349 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:45.607 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.865 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.865 "name": "BaseBdev3", 00:17:45.865 "aliases": [ 00:17:45.865 "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce" 00:17:45.865 ], 00:17:45.865 "product_name": "Malloc disk", 00:17:45.865 "block_size": 512, 00:17:45.865 "num_blocks": 65536, 00:17:45.865 "uuid": "a04324e7-de6c-40ad-b9bd-b0cfbdff03ce", 00:17:45.865 "assigned_rate_limits": { 00:17:45.865 "rw_ios_per_sec": 0, 00:17:45.865 "rw_mbytes_per_sec": 0, 00:17:45.865 "r_mbytes_per_sec": 0, 00:17:45.865 "w_mbytes_per_sec": 0 00:17:45.865 }, 00:17:45.865 "claimed": true, 00:17:45.865 "claim_type": "exclusive_write", 00:17:45.865 "zoned": false, 00:17:45.865 "supported_io_types": { 00:17:45.865 "read": true, 00:17:45.865 "write": true, 00:17:45.865 "unmap": true, 00:17:45.865 "flush": true, 00:17:45.865 "reset": true, 00:17:45.865 "nvme_admin": false, 00:17:45.865 "nvme_io": false, 00:17:45.865 "nvme_io_md": false, 00:17:45.865 "write_zeroes": true, 00:17:45.865 "zcopy": true, 00:17:45.865 "get_zone_info": false, 00:17:45.865 "zone_management": false, 00:17:45.865 "zone_append": false, 00:17:45.865 "compare": false, 00:17:45.865 "compare_and_write": false, 00:17:45.865 "abort": true, 00:17:45.865 "seek_hole": false, 00:17:45.865 "seek_data": false, 00:17:45.865 "copy": true, 00:17:45.865 "nvme_iov_md": false 00:17:45.865 }, 00:17:45.865 "memory_domains": [ 00:17:45.865 { 00:17:45.865 "dma_device_id": "system", 00:17:45.865 "dma_device_type": 1 00:17:45.865 }, 00:17:45.865 { 00:17:45.865 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.866 
"dma_device_type": 2 00:17:45.866 } 00:17:45.866 ], 00:17:45.866 "driver_specific": {} 00:17:45.866 }' 00:17:45.866 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.866 11:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.866 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.866 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.866 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.866 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.866 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:46.124 "name": "BaseBdev4", 00:17:46.124 "aliases": [ 00:17:46.124 
"0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c" 00:17:46.124 ], 00:17:46.124 "product_name": "Malloc disk", 00:17:46.124 "block_size": 512, 00:17:46.124 "num_blocks": 65536, 00:17:46.124 "uuid": "0e7f6af5-12f4-48e2-b5b0-b28f1e1a525c", 00:17:46.124 "assigned_rate_limits": { 00:17:46.124 "rw_ios_per_sec": 0, 00:17:46.124 "rw_mbytes_per_sec": 0, 00:17:46.124 "r_mbytes_per_sec": 0, 00:17:46.124 "w_mbytes_per_sec": 0 00:17:46.124 }, 00:17:46.124 "claimed": true, 00:17:46.124 "claim_type": "exclusive_write", 00:17:46.124 "zoned": false, 00:17:46.124 "supported_io_types": { 00:17:46.124 "read": true, 00:17:46.124 "write": true, 00:17:46.124 "unmap": true, 00:17:46.124 "flush": true, 00:17:46.124 "reset": true, 00:17:46.124 "nvme_admin": false, 00:17:46.124 "nvme_io": false, 00:17:46.124 "nvme_io_md": false, 00:17:46.124 "write_zeroes": true, 00:17:46.124 "zcopy": true, 00:17:46.124 "get_zone_info": false, 00:17:46.124 "zone_management": false, 00:17:46.124 "zone_append": false, 00:17:46.124 "compare": false, 00:17:46.124 "compare_and_write": false, 00:17:46.124 "abort": true, 00:17:46.124 "seek_hole": false, 00:17:46.124 "seek_data": false, 00:17:46.124 "copy": true, 00:17:46.124 "nvme_iov_md": false 00:17:46.124 }, 00:17:46.124 "memory_domains": [ 00:17:46.124 { 00:17:46.124 "dma_device_id": "system", 00:17:46.124 "dma_device_type": 1 00:17:46.124 }, 00:17:46.124 { 00:17:46.124 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.124 "dma_device_type": 2 00:17:46.124 } 00:17:46.124 ], 00:17:46.124 "driver_specific": {} 00:17:46.124 }' 00:17:46.124 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.382 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.382 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:46.382 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.382 11:58:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.382 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:46.382 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.382 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.382 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.382 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.382 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:46.640 [2024-07-12 11:58:36.791450] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:46.640 [2024-07-12 11:58:36.791468] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:46.640 [2024-07-12 11:58:36.791504] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:46.640 [2024-07-12 11:58:36.791698] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:46.640 [2024-07-12 11:58:36.791705] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x244c100 name Existed_Raid, state offline 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 682277 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 682277 ']' 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@952 -- # kill -0 682277 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 682277 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 682277' 00:17:46.640 killing process with pid 682277 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 682277 00:17:46.640 [2024-07-12 11:58:36.849955] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:46.640 11:58:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 682277 00:17:46.640 [2024-07-12 11:58:36.881017] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:46.898 11:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:46.898 00:17:46.898 real 0m24.094s 00:17:46.898 user 0m44.974s 00:17:46.898 sys 0m3.670s 00:17:46.898 11:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:46.898 11:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:46.898 ************************************ 00:17:46.898 END TEST raid_state_function_test_sb 00:17:46.898 ************************************ 00:17:46.898 11:58:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:46.898 11:58:37 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test 
raid_superblock_test raid1 4 00:17:46.898 11:58:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:46.898 11:58:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:46.898 11:58:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:46.898 ************************************ 00:17:46.898 START TEST raid_superblock_test 00:17:46.898 ************************************ 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' 
raid1 '!=' raid1 ']' 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=686926 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 686926 /var/tmp/spdk-raid.sock 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 686926 ']' 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:46.898 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:46.898 11:58:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.155 [2024-07-12 11:58:37.180109] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:17:47.155 [2024-07-12 11:58:37.180148] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid686926 ] 00:17:47.155 [2024-07-12 11:58:37.243101] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:47.155 [2024-07-12 11:58:37.321856] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:17:47.155 [2024-07-12 11:58:37.374326] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:47.155 [2024-07-12 11:58:37.374368] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:48.086 11:58:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:17:48.086 malloc1 00:17:48.086 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:48.086 [2024-07-12 11:58:38.286055] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:48.087 [2024-07-12 11:58:38.286101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:48.087 [2024-07-12 11:58:38.286113] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b4270 00:17:48.087 [2024-07-12 11:58:38.286119] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:48.087 [2024-07-12 11:58:38.287299] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:48.087 [2024-07-12 11:58:38.287318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:48.087 pt1 00:17:48.087 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:48.087 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:48.087 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:48.087 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:48.087 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:48.087 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:48.087 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:48.087 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:48.087 11:58:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:48.344 malloc2 00:17:48.344 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:48.602 [2024-07-12 11:58:38.614613] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:48.602 [2024-07-12 11:58:38.614643] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:48.602 [2024-07-12 11:58:38.614652] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b5580 00:17:48.602 [2024-07-12 11:58:38.614659] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:48.602 [2024-07-12 11:58:38.615704] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:48.602 [2024-07-12 11:58:38.615724] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:48.602 pt2 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:48.602 malloc3 00:17:48.602 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:48.860 [2024-07-12 11:58:38.947037] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:48.860 [2024-07-12 11:58:38.947069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:48.860 [2024-07-12 11:58:38.947080] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x235fe30 00:17:48.860 [2024-07-12 11:58:38.947086] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:48.860 [2024-07-12 11:58:38.948170] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:48.860 [2024-07-12 11:58:38.948189] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:48.860 pt3 00:17:48.860 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:48.860 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:48.860 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:17:48.860 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:17:48.860 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:17:48.860 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:48.860 11:58:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:48.860 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:48.860 11:58:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:17:49.119 malloc4 00:17:49.119 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:49.119 [2024-07-12 11:58:39.287245] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:49.119 [2024-07-12 11:58:39.287274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:49.119 [2024-07-12 11:58:39.287284] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2362570 00:17:49.119 [2024-07-12 11:58:39.287289] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:49.119 [2024-07-12 11:58:39.288346] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:49.119 [2024-07-12 11:58:39.288365] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:49.119 pt4 00:17:49.119 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:49.119 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:49.119 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:17:49.376 [2024-07-12 11:58:39.451683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:49.376 
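(Editor's note: the trace above drives SPDK over `rpc.py`, repeating `bdev_malloc_create 32 512 -b mallocN` and `bdev_passthru_create -b mallocN -p ptN -u <uuid>` for four base devices and then assembling them with `bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s`. The sketch below builds the corresponding JSON-RPC 2.0 request envelopes. Method names are taken verbatim from the log; the parameter names are assumptions inferred from the CLI flags, not a definitive reflection of SPDK's RPC schema.)

```python
import itertools

_ids = itertools.count(1)

def rpc(method, **params):
    # Wrap a method call in a JSON-RPC 2.0 envelope, as rpc.py does
    # before writing it to the /var/tmp/spdk-raid.sock socket.
    return {"jsonrpc": "2.0", "id": next(_ids), "method": method, "params": params}

def build_raid1_setup(num_base=4):
    reqs = []
    for i in range(1, num_base + 1):
        # `bdev_malloc_create 32 512 -b mallocN`: 32 MB volume, 512-byte blocks
        # (parameter names here are assumed, not taken from the SPDK schema)
        reqs.append(rpc("bdev_malloc_create", total_size=32, block_size=512,
                        name=f"malloc{i}"))
        # wrap each malloc bdev in a passthru bdev with a fixed UUID,
        # mirroring `bdev_passthru_create -b mallocN -p ptN -u <uuid>`
        reqs.append(rpc("bdev_passthru_create", base_bdev_name=f"malloc{i}",
                        name=f"pt{i}",
                        uuid=f"00000000-0000-0000-0000-{i:012d}"))
    # assemble the passthru bdevs into a raid1 volume with a superblock (-s)
    reqs.append(rpc("bdev_raid_create", name="raid_bdev1", raid_level="raid1",
                    base_bdevs=[f"pt{i}" for i in range(1, num_base + 1)],
                    superblock=True))
    return reqs
```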
[2024-07-12 11:58:39.452586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:49.376 [2024-07-12 11:58:39.452631] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:49.376 [2024-07-12 11:58:39.452661] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:49.376 [2024-07-12 11:58:39.452782] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2363a80 00:17:49.376 [2024-07-12 11:58:39.452788] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:49.376 [2024-07-12 11:58:39.452925] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21b35d0 00:17:49.376 [2024-07-12 11:58:39.453028] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2363a80 00:17:49.376 [2024-07-12 11:58:39.453033] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2363a80 00:17:49.376 [2024-07-12 11:58:39.453097] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.376 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:49.634 11:58:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.634 "name": "raid_bdev1", 00:17:49.634 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:17:49.634 "strip_size_kb": 0, 00:17:49.634 "state": "online", 00:17:49.634 "raid_level": "raid1", 00:17:49.634 "superblock": true, 00:17:49.634 "num_base_bdevs": 4, 00:17:49.634 "num_base_bdevs_discovered": 4, 00:17:49.634 "num_base_bdevs_operational": 4, 00:17:49.634 "base_bdevs_list": [ 00:17:49.634 { 00:17:49.634 "name": "pt1", 00:17:49.634 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:49.634 "is_configured": true, 00:17:49.634 "data_offset": 2048, 00:17:49.634 "data_size": 63488 00:17:49.634 }, 00:17:49.634 { 00:17:49.634 "name": "pt2", 00:17:49.634 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:49.634 "is_configured": true, 00:17:49.634 "data_offset": 2048, 00:17:49.634 "data_size": 63488 00:17:49.634 }, 00:17:49.634 { 00:17:49.634 "name": "pt3", 00:17:49.634 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:49.634 "is_configured": true, 00:17:49.634 "data_offset": 2048, 00:17:49.634 "data_size": 63488 00:17:49.634 }, 00:17:49.634 { 00:17:49.634 "name": "pt4", 00:17:49.634 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:49.634 "is_configured": true, 00:17:49.634 "data_offset": 2048, 00:17:49.634 "data_size": 63488 00:17:49.634 } 00:17:49.634 ] 00:17:49.634 }' 00:17:49.634 11:58:39 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.634 11:58:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.892 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:49.892 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:49.892 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:49.892 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:49.892 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:49.892 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:49.892 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:49.892 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:50.152 [2024-07-12 11:58:40.274227] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:50.152 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:50.152 "name": "raid_bdev1", 00:17:50.152 "aliases": [ 00:17:50.152 "def2915b-5aff-4786-ae71-c1e85208df84" 00:17:50.152 ], 00:17:50.152 "product_name": "Raid Volume", 00:17:50.152 "block_size": 512, 00:17:50.152 "num_blocks": 63488, 00:17:50.152 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:17:50.152 "assigned_rate_limits": { 00:17:50.152 "rw_ios_per_sec": 0, 00:17:50.152 "rw_mbytes_per_sec": 0, 00:17:50.152 "r_mbytes_per_sec": 0, 00:17:50.152 "w_mbytes_per_sec": 0 00:17:50.152 }, 00:17:50.152 "claimed": false, 00:17:50.152 "zoned": false, 00:17:50.152 "supported_io_types": { 00:17:50.152 "read": true, 00:17:50.152 "write": true, 00:17:50.152 "unmap": false, 00:17:50.152 "flush": 
false, 00:17:50.152 "reset": true, 00:17:50.152 "nvme_admin": false, 00:17:50.152 "nvme_io": false, 00:17:50.152 "nvme_io_md": false, 00:17:50.152 "write_zeroes": true, 00:17:50.152 "zcopy": false, 00:17:50.152 "get_zone_info": false, 00:17:50.152 "zone_management": false, 00:17:50.152 "zone_append": false, 00:17:50.152 "compare": false, 00:17:50.152 "compare_and_write": false, 00:17:50.152 "abort": false, 00:17:50.152 "seek_hole": false, 00:17:50.152 "seek_data": false, 00:17:50.152 "copy": false, 00:17:50.152 "nvme_iov_md": false 00:17:50.152 }, 00:17:50.152 "memory_domains": [ 00:17:50.152 { 00:17:50.152 "dma_device_id": "system", 00:17:50.152 "dma_device_type": 1 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.152 "dma_device_type": 2 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "dma_device_id": "system", 00:17:50.152 "dma_device_type": 1 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.152 "dma_device_type": 2 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "dma_device_id": "system", 00:17:50.152 "dma_device_type": 1 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.152 "dma_device_type": 2 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "dma_device_id": "system", 00:17:50.152 "dma_device_type": 1 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.152 "dma_device_type": 2 00:17:50.152 } 00:17:50.152 ], 00:17:50.152 "driver_specific": { 00:17:50.152 "raid": { 00:17:50.152 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:17:50.152 "strip_size_kb": 0, 00:17:50.152 "state": "online", 00:17:50.152 "raid_level": "raid1", 00:17:50.152 "superblock": true, 00:17:50.152 "num_base_bdevs": 4, 00:17:50.152 "num_base_bdevs_discovered": 4, 00:17:50.152 "num_base_bdevs_operational": 4, 00:17:50.152 "base_bdevs_list": [ 00:17:50.152 { 00:17:50.152 "name": "pt1", 00:17:50.152 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:17:50.152 "is_configured": true, 00:17:50.152 "data_offset": 2048, 00:17:50.152 "data_size": 63488 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "name": "pt2", 00:17:50.152 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:50.152 "is_configured": true, 00:17:50.152 "data_offset": 2048, 00:17:50.152 "data_size": 63488 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "name": "pt3", 00:17:50.152 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:50.152 "is_configured": true, 00:17:50.152 "data_offset": 2048, 00:17:50.152 "data_size": 63488 00:17:50.152 }, 00:17:50.152 { 00:17:50.152 "name": "pt4", 00:17:50.152 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:50.152 "is_configured": true, 00:17:50.152 "data_offset": 2048, 00:17:50.152 "data_size": 63488 00:17:50.152 } 00:17:50.152 ] 00:17:50.152 } 00:17:50.152 } 00:17:50.152 }' 00:17:50.152 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:50.152 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:50.152 pt2 00:17:50.152 pt3 00:17:50.152 pt4' 00:17:50.152 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:50.152 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:50.152 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:50.412 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:50.412 "name": "pt1", 00:17:50.412 "aliases": [ 00:17:50.412 "00000000-0000-0000-0000-000000000001" 00:17:50.412 ], 00:17:50.412 "product_name": "passthru", 00:17:50.412 "block_size": 512, 00:17:50.412 "num_blocks": 65536, 00:17:50.412 "uuid": "00000000-0000-0000-0000-000000000001", 
00:17:50.412 "assigned_rate_limits": { 00:17:50.412 "rw_ios_per_sec": 0, 00:17:50.412 "rw_mbytes_per_sec": 0, 00:17:50.412 "r_mbytes_per_sec": 0, 00:17:50.412 "w_mbytes_per_sec": 0 00:17:50.412 }, 00:17:50.412 "claimed": true, 00:17:50.412 "claim_type": "exclusive_write", 00:17:50.412 "zoned": false, 00:17:50.412 "supported_io_types": { 00:17:50.412 "read": true, 00:17:50.412 "write": true, 00:17:50.412 "unmap": true, 00:17:50.412 "flush": true, 00:17:50.412 "reset": true, 00:17:50.412 "nvme_admin": false, 00:17:50.412 "nvme_io": false, 00:17:50.412 "nvme_io_md": false, 00:17:50.412 "write_zeroes": true, 00:17:50.412 "zcopy": true, 00:17:50.412 "get_zone_info": false, 00:17:50.412 "zone_management": false, 00:17:50.412 "zone_append": false, 00:17:50.412 "compare": false, 00:17:50.412 "compare_and_write": false, 00:17:50.412 "abort": true, 00:17:50.412 "seek_hole": false, 00:17:50.412 "seek_data": false, 00:17:50.412 "copy": true, 00:17:50.412 "nvme_iov_md": false 00:17:50.412 }, 00:17:50.412 "memory_domains": [ 00:17:50.412 { 00:17:50.412 "dma_device_id": "system", 00:17:50.412 "dma_device_type": 1 00:17:50.412 }, 00:17:50.412 { 00:17:50.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.412 "dma_device_type": 2 00:17:50.412 } 00:17:50.412 ], 00:17:50.412 "driver_specific": { 00:17:50.412 "passthru": { 00:17:50.412 "name": "pt1", 00:17:50.412 "base_bdev_name": "malloc1" 00:17:50.412 } 00:17:50.412 } 00:17:50.412 }' 00:17:50.412 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.412 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.412 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:50.412 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.412 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:50.670 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:50.929 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:50.929 "name": "pt2", 00:17:50.929 "aliases": [ 00:17:50.929 "00000000-0000-0000-0000-000000000002" 00:17:50.929 ], 00:17:50.929 "product_name": "passthru", 00:17:50.929 "block_size": 512, 00:17:50.929 "num_blocks": 65536, 00:17:50.929 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:50.929 "assigned_rate_limits": { 00:17:50.929 "rw_ios_per_sec": 0, 00:17:50.929 "rw_mbytes_per_sec": 0, 00:17:50.929 "r_mbytes_per_sec": 0, 00:17:50.929 "w_mbytes_per_sec": 0 00:17:50.929 }, 00:17:50.929 "claimed": true, 00:17:50.929 "claim_type": "exclusive_write", 00:17:50.929 "zoned": false, 00:17:50.929 "supported_io_types": { 00:17:50.929 "read": true, 00:17:50.929 "write": true, 00:17:50.929 "unmap": true, 00:17:50.929 "flush": true, 00:17:50.929 "reset": true, 00:17:50.929 "nvme_admin": false, 00:17:50.929 "nvme_io": false, 00:17:50.929 
"nvme_io_md": false, 00:17:50.929 "write_zeroes": true, 00:17:50.929 "zcopy": true, 00:17:50.929 "get_zone_info": false, 00:17:50.929 "zone_management": false, 00:17:50.929 "zone_append": false, 00:17:50.929 "compare": false, 00:17:50.929 "compare_and_write": false, 00:17:50.929 "abort": true, 00:17:50.929 "seek_hole": false, 00:17:50.929 "seek_data": false, 00:17:50.929 "copy": true, 00:17:50.929 "nvme_iov_md": false 00:17:50.929 }, 00:17:50.929 "memory_domains": [ 00:17:50.929 { 00:17:50.929 "dma_device_id": "system", 00:17:50.929 "dma_device_type": 1 00:17:50.929 }, 00:17:50.929 { 00:17:50.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.929 "dma_device_type": 2 00:17:50.929 } 00:17:50.929 ], 00:17:50.929 "driver_specific": { 00:17:50.929 "passthru": { 00:17:50.929 "name": "pt2", 00:17:50.929 "base_bdev_name": "malloc2" 00:17:50.929 } 00:17:50.929 } 00:17:50.929 }' 00:17:50.929 11:58:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.929 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.929 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:50.929 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.929 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.929 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:50.929 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.188 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.188 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:51.188 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.188 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.188 11:58:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:51.188 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:51.188 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:51.188 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:51.446 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:51.446 "name": "pt3", 00:17:51.446 "aliases": [ 00:17:51.446 "00000000-0000-0000-0000-000000000003" 00:17:51.447 ], 00:17:51.447 "product_name": "passthru", 00:17:51.447 "block_size": 512, 00:17:51.447 "num_blocks": 65536, 00:17:51.447 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:51.447 "assigned_rate_limits": { 00:17:51.447 "rw_ios_per_sec": 0, 00:17:51.447 "rw_mbytes_per_sec": 0, 00:17:51.447 "r_mbytes_per_sec": 0, 00:17:51.447 "w_mbytes_per_sec": 0 00:17:51.447 }, 00:17:51.447 "claimed": true, 00:17:51.447 "claim_type": "exclusive_write", 00:17:51.447 "zoned": false, 00:17:51.447 "supported_io_types": { 00:17:51.447 "read": true, 00:17:51.447 "write": true, 00:17:51.447 "unmap": true, 00:17:51.447 "flush": true, 00:17:51.447 "reset": true, 00:17:51.447 "nvme_admin": false, 00:17:51.447 "nvme_io": false, 00:17:51.447 "nvme_io_md": false, 00:17:51.447 "write_zeroes": true, 00:17:51.447 "zcopy": true, 00:17:51.447 "get_zone_info": false, 00:17:51.447 "zone_management": false, 00:17:51.447 "zone_append": false, 00:17:51.447 "compare": false, 00:17:51.447 "compare_and_write": false, 00:17:51.447 "abort": true, 00:17:51.447 "seek_hole": false, 00:17:51.447 "seek_data": false, 00:17:51.447 "copy": true, 00:17:51.447 "nvme_iov_md": false 00:17:51.447 }, 00:17:51.447 "memory_domains": [ 00:17:51.447 { 00:17:51.447 "dma_device_id": "system", 00:17:51.447 "dma_device_type": 1 00:17:51.447 
}, 00:17:51.447 { 00:17:51.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.447 "dma_device_type": 2 00:17:51.447 } 00:17:51.447 ], 00:17:51.447 "driver_specific": { 00:17:51.447 "passthru": { 00:17:51.447 "name": "pt3", 00:17:51.447 "base_bdev_name": "malloc3" 00:17:51.447 } 00:17:51.447 } 00:17:51.447 }' 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:51.447 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.706 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.706 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:51.706 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:51.706 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:51.706 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:51.706 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:17:51.706 "name": "pt4", 00:17:51.706 "aliases": [ 00:17:51.706 "00000000-0000-0000-0000-000000000004" 00:17:51.706 ], 00:17:51.706 "product_name": "passthru", 00:17:51.706 "block_size": 512, 00:17:51.706 "num_blocks": 65536, 00:17:51.706 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:51.706 "assigned_rate_limits": { 00:17:51.706 "rw_ios_per_sec": 0, 00:17:51.706 "rw_mbytes_per_sec": 0, 00:17:51.706 "r_mbytes_per_sec": 0, 00:17:51.706 "w_mbytes_per_sec": 0 00:17:51.706 }, 00:17:51.706 "claimed": true, 00:17:51.706 "claim_type": "exclusive_write", 00:17:51.706 "zoned": false, 00:17:51.706 "supported_io_types": { 00:17:51.706 "read": true, 00:17:51.706 "write": true, 00:17:51.706 "unmap": true, 00:17:51.706 "flush": true, 00:17:51.706 "reset": true, 00:17:51.706 "nvme_admin": false, 00:17:51.706 "nvme_io": false, 00:17:51.706 "nvme_io_md": false, 00:17:51.706 "write_zeroes": true, 00:17:51.706 "zcopy": true, 00:17:51.706 "get_zone_info": false, 00:17:51.706 "zone_management": false, 00:17:51.706 "zone_append": false, 00:17:51.706 "compare": false, 00:17:51.706 "compare_and_write": false, 00:17:51.706 "abort": true, 00:17:51.706 "seek_hole": false, 00:17:51.706 "seek_data": false, 00:17:51.706 "copy": true, 00:17:51.706 "nvme_iov_md": false 00:17:51.706 }, 00:17:51.706 "memory_domains": [ 00:17:51.706 { 00:17:51.706 "dma_device_id": "system", 00:17:51.706 "dma_device_type": 1 00:17:51.706 }, 00:17:51.706 { 00:17:51.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.706 "dma_device_type": 2 00:17:51.706 } 00:17:51.706 ], 00:17:51.706 "driver_specific": { 00:17:51.706 "passthru": { 00:17:51.706 "name": "pt4", 00:17:51.706 "base_bdev_name": "malloc4" 00:17:51.706 } 00:17:51.706 } 00:17:51.706 }' 00:17:51.706 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.965 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.965 11:58:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:51.965 11:58:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.965 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.965 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:51.965 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.965 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.965 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:51.965 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.965 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:52.224 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:52.224 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:52.224 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:52.224 [2024-07-12 11:58:42.391711] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:52.224 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=def2915b-5aff-4786-ae71-c1e85208df84 00:17:52.224 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z def2915b-5aff-4786-ae71-c1e85208df84 ']' 00:17:52.224 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:52.483 [2024-07-12 11:58:42.559931] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:52.483 [2024-07-12 11:58:42.559944] 
bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:52.483 [2024-07-12 11:58:42.559977] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:52.483 [2024-07-12 11:58:42.560032] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:52.483 [2024-07-12 11:58:42.560038] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2363a80 name raid_bdev1, state offline 00:17:52.483 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.483 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:52.741 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:52.741 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:52.741 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:52.741 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:52.741 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:52.741 11:58:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:52.998 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:52.998 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:53.256 11:58:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:53.256 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:53.256 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:53.256 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:53.515 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:53.515 [2024-07-12 11:58:43.734929] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:53.515 [2024-07-12 11:58:43.735903] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:53.515 [2024-07-12 11:58:43.735932] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:53.515 [2024-07-12 11:58:43.735952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:17:53.515 [2024-07-12 11:58:43.735984] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:53.515 [2024-07-12 11:58:43.736008] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:53.515 [2024-07-12 11:58:43.736019] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:53.515 [2024-07-12 11:58:43.736047] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:17:53.515 [2024-07-12 11:58:43.736056] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:53.515 [2024-07-12 11:58:43.736066] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21b3810 name raid_bdev1, state configuring 00:17:53.515 request: 00:17:53.515 { 00:17:53.515 "name": "raid_bdev1", 00:17:53.515 "raid_level": "raid1", 00:17:53.515 "base_bdevs": [ 00:17:53.515 "malloc1", 00:17:53.515 "malloc2", 00:17:53.515 "malloc3", 00:17:53.515 "malloc4" 00:17:53.515 ], 00:17:53.515 "superblock": false, 00:17:53.515 "method": "bdev_raid_create", 00:17:53.515 "req_id": 1 00:17:53.515 } 00:17:53.515 Got JSON-RPC error response 00:17:53.515 response: 00:17:53.515 { 00:17:53.515 "code": -17, 00:17:53.515 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:53.516 } 00:17:53.516 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:53.516 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:53.516 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:53.516 11:58:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:53.516 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.516 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:53.774 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:53.774 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:53.774 11:58:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:54.033 [2024-07-12 11:58:44.059731] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:54.033 [2024-07-12 11:58:44.059757] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:54.033 [2024-07-12 11:58:44.059766] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b4d90 00:17:54.033 [2024-07-12 11:58:44.059771] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:54.033 [2024-07-12 11:58:44.060968] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:54.033 [2024-07-12 11:58:44.060988] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:54.033 [2024-07-12 11:58:44.061033] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:54.033 [2024-07-12 11:58:44.061052] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:54.033 pt1 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.033 11:58:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.034 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.034 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:54.034 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.034 "name": "raid_bdev1", 00:17:54.034 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:17:54.034 "strip_size_kb": 0, 00:17:54.034 "state": "configuring", 00:17:54.034 "raid_level": "raid1", 00:17:54.034 "superblock": true, 00:17:54.034 "num_base_bdevs": 4, 00:17:54.034 "num_base_bdevs_discovered": 1, 00:17:54.034 "num_base_bdevs_operational": 4, 00:17:54.034 "base_bdevs_list": [ 00:17:54.034 { 00:17:54.034 "name": "pt1", 00:17:54.034 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:54.034 "is_configured": true, 00:17:54.034 "data_offset": 2048, 00:17:54.034 "data_size": 63488 00:17:54.034 }, 00:17:54.034 { 00:17:54.034 "name": null, 00:17:54.034 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:54.034 "is_configured": false, 00:17:54.034 "data_offset": 2048, 00:17:54.034 "data_size": 63488 00:17:54.034 }, 00:17:54.034 { 00:17:54.034 "name": null, 00:17:54.034 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:54.034 "is_configured": false, 00:17:54.034 "data_offset": 2048, 00:17:54.034 "data_size": 63488 00:17:54.034 }, 00:17:54.034 { 00:17:54.034 "name": null, 00:17:54.034 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:54.034 "is_configured": false, 00:17:54.034 "data_offset": 2048, 00:17:54.034 "data_size": 63488 00:17:54.034 } 00:17:54.034 ] 00:17:54.034 }' 00:17:54.034 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.034 11:58:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.601 11:58:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:17:54.601 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:54.860 [2024-07-12 11:58:44.885866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:54.860 [2024-07-12 11:58:44.885898] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:54.860 [2024-07-12 11:58:44.885909] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2365300 00:17:54.860 [2024-07-12 11:58:44.885932] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:54.860 [2024-07-12 11:58:44.886177] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:54.860 [2024-07-12 11:58:44.886187] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:54.860 [2024-07-12 11:58:44.886231] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:54.860 [2024-07-12 11:58:44.886243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:54.860 pt2 00:17:54.860 11:58:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:54.860 [2024-07-12 11:58:45.054317] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.860 
11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.860 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:55.119 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.119 "name": "raid_bdev1", 00:17:55.119 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:17:55.119 "strip_size_kb": 0, 00:17:55.119 "state": "configuring", 00:17:55.119 "raid_level": "raid1", 00:17:55.119 "superblock": true, 00:17:55.119 "num_base_bdevs": 4, 00:17:55.119 "num_base_bdevs_discovered": 1, 00:17:55.119 "num_base_bdevs_operational": 4, 00:17:55.119 "base_bdevs_list": [ 00:17:55.119 { 00:17:55.119 "name": "pt1", 00:17:55.120 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:55.120 "is_configured": true, 00:17:55.120 "data_offset": 2048, 00:17:55.120 "data_size": 63488 00:17:55.120 }, 00:17:55.120 { 00:17:55.120 "name": null, 00:17:55.120 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:55.120 "is_configured": false, 00:17:55.120 "data_offset": 2048, 00:17:55.120 "data_size": 63488 00:17:55.120 }, 
00:17:55.120 { 00:17:55.120 "name": null, 00:17:55.120 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:55.120 "is_configured": false, 00:17:55.120 "data_offset": 2048, 00:17:55.120 "data_size": 63488 00:17:55.120 }, 00:17:55.120 { 00:17:55.120 "name": null, 00:17:55.120 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:55.120 "is_configured": false, 00:17:55.120 "data_offset": 2048, 00:17:55.120 "data_size": 63488 00:17:55.120 } 00:17:55.120 ] 00:17:55.120 }' 00:17:55.120 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.120 11:58:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.687 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:55.687 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:55.687 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:55.687 [2024-07-12 11:58:45.876495] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:55.687 [2024-07-12 11:58:45.876534] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:55.687 [2024-07-12 11:58:45.876547] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b4640 00:17:55.687 [2024-07-12 11:58:45.876553] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:55.687 [2024-07-12 11:58:45.876796] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:55.687 [2024-07-12 11:58:45.876806] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:55.687 [2024-07-12 11:58:45.876849] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:55.687 [2024-07-12 
11:58:45.876861] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:55.687 pt2 00:17:55.687 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:55.687 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:55.687 11:58:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:55.946 [2024-07-12 11:58:46.044931] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:55.946 [2024-07-12 11:58:46.044950] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:55.946 [2024-07-12 11:58:46.044958] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x235e520 00:17:55.946 [2024-07-12 11:58:46.044964] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:55.946 [2024-07-12 11:58:46.045174] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:55.946 [2024-07-12 11:58:46.045184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:55.946 [2024-07-12 11:58:46.045219] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:55.946 [2024-07-12 11:58:46.045231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:55.946 pt3 00:17:55.946 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:55.946 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:55.946 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 
00:17:56.205 [2024-07-12 11:58:46.209357] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:56.205 [2024-07-12 11:58:46.209384] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:56.205 [2024-07-12 11:58:46.209393] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2365630 00:17:56.205 [2024-07-12 11:58:46.209398] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:56.205 [2024-07-12 11:58:46.209641] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:56.205 [2024-07-12 11:58:46.209650] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:56.205 [2024-07-12 11:58:46.209685] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:56.205 [2024-07-12 11:58:46.209697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:56.205 [2024-07-12 11:58:46.209785] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2361190 00:17:56.205 [2024-07-12 11:58:46.209791] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:56.205 [2024-07-12 11:58:46.209909] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23642c0 00:17:56.205 [2024-07-12 11:58:46.210003] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2361190 00:17:56.205 [2024-07-12 11:58:46.210008] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2361190 00:17:56.205 [2024-07-12 11:58:46.210075] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:56.205 pt4 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.205 "name": "raid_bdev1", 00:17:56.205 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:17:56.205 "strip_size_kb": 0, 00:17:56.205 "state": "online", 00:17:56.205 "raid_level": "raid1", 00:17:56.205 "superblock": true, 00:17:56.205 "num_base_bdevs": 4, 00:17:56.205 "num_base_bdevs_discovered": 4, 00:17:56.205 "num_base_bdevs_operational": 4, 00:17:56.205 "base_bdevs_list": [ 00:17:56.205 { 00:17:56.205 "name": "pt1", 00:17:56.205 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:56.205 "is_configured": true, 
00:17:56.205 "data_offset": 2048, 00:17:56.205 "data_size": 63488 00:17:56.205 }, 00:17:56.205 { 00:17:56.205 "name": "pt2", 00:17:56.205 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:56.205 "is_configured": true, 00:17:56.205 "data_offset": 2048, 00:17:56.205 "data_size": 63488 00:17:56.205 }, 00:17:56.205 { 00:17:56.205 "name": "pt3", 00:17:56.205 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:56.205 "is_configured": true, 00:17:56.205 "data_offset": 2048, 00:17:56.205 "data_size": 63488 00:17:56.205 }, 00:17:56.205 { 00:17:56.205 "name": "pt4", 00:17:56.205 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:56.205 "is_configured": true, 00:17:56.205 "data_offset": 2048, 00:17:56.205 "data_size": 63488 00:17:56.205 } 00:17:56.205 ] 00:17:56.205 }' 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.205 11:58:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.774 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:56.774 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:56.774 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:56.774 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:56.774 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:56.774 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:56.774 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:56.774 11:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:57.033 [2024-07-12 11:58:47.027685] bdev_raid.c:1107:raid_bdev_dump_info_json: 
*DEBUG*: raid_bdev_dump_config_json 00:17:57.033 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:57.033 "name": "raid_bdev1", 00:17:57.033 "aliases": [ 00:17:57.033 "def2915b-5aff-4786-ae71-c1e85208df84" 00:17:57.033 ], 00:17:57.033 "product_name": "Raid Volume", 00:17:57.033 "block_size": 512, 00:17:57.033 "num_blocks": 63488, 00:17:57.033 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:17:57.033 "assigned_rate_limits": { 00:17:57.033 "rw_ios_per_sec": 0, 00:17:57.033 "rw_mbytes_per_sec": 0, 00:17:57.033 "r_mbytes_per_sec": 0, 00:17:57.033 "w_mbytes_per_sec": 0 00:17:57.033 }, 00:17:57.033 "claimed": false, 00:17:57.033 "zoned": false, 00:17:57.033 "supported_io_types": { 00:17:57.033 "read": true, 00:17:57.033 "write": true, 00:17:57.033 "unmap": false, 00:17:57.033 "flush": false, 00:17:57.033 "reset": true, 00:17:57.033 "nvme_admin": false, 00:17:57.033 "nvme_io": false, 00:17:57.033 "nvme_io_md": false, 00:17:57.033 "write_zeroes": true, 00:17:57.033 "zcopy": false, 00:17:57.033 "get_zone_info": false, 00:17:57.033 "zone_management": false, 00:17:57.033 "zone_append": false, 00:17:57.033 "compare": false, 00:17:57.033 "compare_and_write": false, 00:17:57.033 "abort": false, 00:17:57.033 "seek_hole": false, 00:17:57.033 "seek_data": false, 00:17:57.033 "copy": false, 00:17:57.033 "nvme_iov_md": false 00:17:57.033 }, 00:17:57.033 "memory_domains": [ 00:17:57.033 { 00:17:57.033 "dma_device_id": "system", 00:17:57.033 "dma_device_type": 1 00:17:57.033 }, 00:17:57.033 { 00:17:57.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.033 "dma_device_type": 2 00:17:57.033 }, 00:17:57.033 { 00:17:57.033 "dma_device_id": "system", 00:17:57.033 "dma_device_type": 1 00:17:57.033 }, 00:17:57.033 { 00:17:57.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.033 "dma_device_type": 2 00:17:57.033 }, 00:17:57.033 { 00:17:57.033 "dma_device_id": "system", 00:17:57.033 "dma_device_type": 1 00:17:57.033 }, 00:17:57.033 { 
00:17:57.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.033 "dma_device_type": 2 00:17:57.033 }, 00:17:57.033 { 00:17:57.033 "dma_device_id": "system", 00:17:57.033 "dma_device_type": 1 00:17:57.033 }, 00:17:57.033 { 00:17:57.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.033 "dma_device_type": 2 00:17:57.033 } 00:17:57.033 ], 00:17:57.033 "driver_specific": { 00:17:57.033 "raid": { 00:17:57.033 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:17:57.033 "strip_size_kb": 0, 00:17:57.033 "state": "online", 00:17:57.033 "raid_level": "raid1", 00:17:57.033 "superblock": true, 00:17:57.033 "num_base_bdevs": 4, 00:17:57.033 "num_base_bdevs_discovered": 4, 00:17:57.033 "num_base_bdevs_operational": 4, 00:17:57.033 "base_bdevs_list": [ 00:17:57.033 { 00:17:57.033 "name": "pt1", 00:17:57.033 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:57.033 "is_configured": true, 00:17:57.033 "data_offset": 2048, 00:17:57.033 "data_size": 63488 00:17:57.033 }, 00:17:57.033 { 00:17:57.033 "name": "pt2", 00:17:57.033 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:57.033 "is_configured": true, 00:17:57.033 "data_offset": 2048, 00:17:57.033 "data_size": 63488 00:17:57.033 }, 00:17:57.033 { 00:17:57.033 "name": "pt3", 00:17:57.033 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:57.034 "is_configured": true, 00:17:57.034 "data_offset": 2048, 00:17:57.034 "data_size": 63488 00:17:57.034 }, 00:17:57.034 { 00:17:57.034 "name": "pt4", 00:17:57.034 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:57.034 "is_configured": true, 00:17:57.034 "data_offset": 2048, 00:17:57.034 "data_size": 63488 00:17:57.034 } 00:17:57.034 ] 00:17:57.034 } 00:17:57.034 } 00:17:57.034 }' 00:17:57.034 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:57.034 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:57.034 pt2 
00:17:57.034 pt3 00:17:57.034 pt4' 00:17:57.034 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:57.034 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:57.034 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:57.034 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:57.034 "name": "pt1", 00:17:57.034 "aliases": [ 00:17:57.034 "00000000-0000-0000-0000-000000000001" 00:17:57.034 ], 00:17:57.034 "product_name": "passthru", 00:17:57.034 "block_size": 512, 00:17:57.034 "num_blocks": 65536, 00:17:57.034 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:57.034 "assigned_rate_limits": { 00:17:57.034 "rw_ios_per_sec": 0, 00:17:57.034 "rw_mbytes_per_sec": 0, 00:17:57.034 "r_mbytes_per_sec": 0, 00:17:57.034 "w_mbytes_per_sec": 0 00:17:57.034 }, 00:17:57.034 "claimed": true, 00:17:57.034 "claim_type": "exclusive_write", 00:17:57.034 "zoned": false, 00:17:57.034 "supported_io_types": { 00:17:57.034 "read": true, 00:17:57.034 "write": true, 00:17:57.034 "unmap": true, 00:17:57.034 "flush": true, 00:17:57.034 "reset": true, 00:17:57.034 "nvme_admin": false, 00:17:57.034 "nvme_io": false, 00:17:57.034 "nvme_io_md": false, 00:17:57.034 "write_zeroes": true, 00:17:57.034 "zcopy": true, 00:17:57.034 "get_zone_info": false, 00:17:57.034 "zone_management": false, 00:17:57.034 "zone_append": false, 00:17:57.034 "compare": false, 00:17:57.034 "compare_and_write": false, 00:17:57.034 "abort": true, 00:17:57.034 "seek_hole": false, 00:17:57.034 "seek_data": false, 00:17:57.034 "copy": true, 00:17:57.034 "nvme_iov_md": false 00:17:57.034 }, 00:17:57.034 "memory_domains": [ 00:17:57.034 { 00:17:57.034 "dma_device_id": "system", 00:17:57.034 "dma_device_type": 1 00:17:57.034 }, 00:17:57.034 { 00:17:57.034 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.034 "dma_device_type": 2 00:17:57.034 } 00:17:57.034 ], 00:17:57.034 "driver_specific": { 00:17:57.034 "passthru": { 00:17:57.034 "name": "pt1", 00:17:57.034 "base_bdev_name": "malloc1" 00:17:57.034 } 00:17:57.034 } 00:17:57.034 }' 00:17:57.034 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:57.293 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:57.553 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:57.553 "name": "pt2", 
00:17:57.553 "aliases": [ 00:17:57.553 "00000000-0000-0000-0000-000000000002" 00:17:57.553 ], 00:17:57.553 "product_name": "passthru", 00:17:57.553 "block_size": 512, 00:17:57.553 "num_blocks": 65536, 00:17:57.553 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:57.553 "assigned_rate_limits": { 00:17:57.553 "rw_ios_per_sec": 0, 00:17:57.553 "rw_mbytes_per_sec": 0, 00:17:57.553 "r_mbytes_per_sec": 0, 00:17:57.553 "w_mbytes_per_sec": 0 00:17:57.553 }, 00:17:57.553 "claimed": true, 00:17:57.553 "claim_type": "exclusive_write", 00:17:57.553 "zoned": false, 00:17:57.553 "supported_io_types": { 00:17:57.553 "read": true, 00:17:57.553 "write": true, 00:17:57.553 "unmap": true, 00:17:57.553 "flush": true, 00:17:57.553 "reset": true, 00:17:57.553 "nvme_admin": false, 00:17:57.553 "nvme_io": false, 00:17:57.553 "nvme_io_md": false, 00:17:57.553 "write_zeroes": true, 00:17:57.553 "zcopy": true, 00:17:57.553 "get_zone_info": false, 00:17:57.553 "zone_management": false, 00:17:57.553 "zone_append": false, 00:17:57.553 "compare": false, 00:17:57.553 "compare_and_write": false, 00:17:57.553 "abort": true, 00:17:57.553 "seek_hole": false, 00:17:57.553 "seek_data": false, 00:17:57.553 "copy": true, 00:17:57.553 "nvme_iov_md": false 00:17:57.553 }, 00:17:57.553 "memory_domains": [ 00:17:57.553 { 00:17:57.553 "dma_device_id": "system", 00:17:57.553 "dma_device_type": 1 00:17:57.553 }, 00:17:57.553 { 00:17:57.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.553 "dma_device_type": 2 00:17:57.553 } 00:17:57.553 ], 00:17:57.553 "driver_specific": { 00:17:57.553 "passthru": { 00:17:57.553 "name": "pt2", 00:17:57.553 "base_bdev_name": "malloc2" 00:17:57.553 } 00:17:57.553 } 00:17:57.553 }' 00:17:57.553 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.553 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.553 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 
]] 00:17:57.553 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.811 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.811 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:57.811 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.811 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.811 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:57.811 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.811 11:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.811 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:57.811 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:57.811 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:57.811 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:58.069 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:58.069 "name": "pt3", 00:17:58.069 "aliases": [ 00:17:58.069 "00000000-0000-0000-0000-000000000003" 00:17:58.069 ], 00:17:58.069 "product_name": "passthru", 00:17:58.069 "block_size": 512, 00:17:58.069 "num_blocks": 65536, 00:17:58.069 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:58.069 "assigned_rate_limits": { 00:17:58.069 "rw_ios_per_sec": 0, 00:17:58.069 "rw_mbytes_per_sec": 0, 00:17:58.069 "r_mbytes_per_sec": 0, 00:17:58.069 "w_mbytes_per_sec": 0 00:17:58.069 }, 00:17:58.069 "claimed": true, 00:17:58.069 "claim_type": "exclusive_write", 00:17:58.069 "zoned": false, 00:17:58.069 
"supported_io_types": { 00:17:58.069 "read": true, 00:17:58.069 "write": true, 00:17:58.069 "unmap": true, 00:17:58.069 "flush": true, 00:17:58.069 "reset": true, 00:17:58.069 "nvme_admin": false, 00:17:58.069 "nvme_io": false, 00:17:58.069 "nvme_io_md": false, 00:17:58.069 "write_zeroes": true, 00:17:58.069 "zcopy": true, 00:17:58.069 "get_zone_info": false, 00:17:58.069 "zone_management": false, 00:17:58.069 "zone_append": false, 00:17:58.069 "compare": false, 00:17:58.069 "compare_and_write": false, 00:17:58.069 "abort": true, 00:17:58.069 "seek_hole": false, 00:17:58.069 "seek_data": false, 00:17:58.069 "copy": true, 00:17:58.069 "nvme_iov_md": false 00:17:58.069 }, 00:17:58.069 "memory_domains": [ 00:17:58.069 { 00:17:58.069 "dma_device_id": "system", 00:17:58.069 "dma_device_type": 1 00:17:58.069 }, 00:17:58.069 { 00:17:58.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.069 "dma_device_type": 2 00:17:58.069 } 00:17:58.069 ], 00:17:58.069 "driver_specific": { 00:17:58.069 "passthru": { 00:17:58.069 "name": "pt3", 00:17:58.069 "base_bdev_name": "malloc3" 00:17:58.069 } 00:17:58.069 } 00:17:58.069 }' 00:17:58.069 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.069 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.069 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:58.069 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.069 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- 
# [[ null == null ]] 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:58.328 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:58.587 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:58.587 "name": "pt4", 00:17:58.587 "aliases": [ 00:17:58.587 "00000000-0000-0000-0000-000000000004" 00:17:58.587 ], 00:17:58.587 "product_name": "passthru", 00:17:58.587 "block_size": 512, 00:17:58.587 "num_blocks": 65536, 00:17:58.587 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:58.587 "assigned_rate_limits": { 00:17:58.587 "rw_ios_per_sec": 0, 00:17:58.587 "rw_mbytes_per_sec": 0, 00:17:58.587 "r_mbytes_per_sec": 0, 00:17:58.587 "w_mbytes_per_sec": 0 00:17:58.587 }, 00:17:58.587 "claimed": true, 00:17:58.587 "claim_type": "exclusive_write", 00:17:58.587 "zoned": false, 00:17:58.587 "supported_io_types": { 00:17:58.587 "read": true, 00:17:58.587 "write": true, 00:17:58.587 "unmap": true, 00:17:58.587 "flush": true, 00:17:58.587 "reset": true, 00:17:58.587 "nvme_admin": false, 00:17:58.587 "nvme_io": false, 00:17:58.587 "nvme_io_md": false, 00:17:58.587 "write_zeroes": true, 00:17:58.587 "zcopy": true, 00:17:58.587 "get_zone_info": false, 00:17:58.587 "zone_management": false, 00:17:58.587 "zone_append": false, 00:17:58.587 "compare": false, 00:17:58.587 "compare_and_write": false, 00:17:58.587 "abort": true, 00:17:58.587 "seek_hole": false, 
00:17:58.587 "seek_data": false, 00:17:58.587 "copy": true, 00:17:58.587 "nvme_iov_md": false 00:17:58.587 }, 00:17:58.587 "memory_domains": [ 00:17:58.587 { 00:17:58.587 "dma_device_id": "system", 00:17:58.587 "dma_device_type": 1 00:17:58.587 }, 00:17:58.587 { 00:17:58.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.587 "dma_device_type": 2 00:17:58.587 } 00:17:58.587 ], 00:17:58.587 "driver_specific": { 00:17:58.587 "passthru": { 00:17:58.587 "name": "pt4", 00:17:58.587 "base_bdev_name": "malloc4" 00:17:58.587 } 00:17:58.587 } 00:17:58.587 }' 00:17:58.587 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.587 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.587 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:58.587 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.587 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.587 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:58.587 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.845 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.845 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:58.845 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.845 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.845 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:58.845 11:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:58.845 11:58:48 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:59.104 [2024-07-12 11:58:49.101128] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' def2915b-5aff-4786-ae71-c1e85208df84 '!=' def2915b-5aff-4786-ae71-c1e85208df84 ']' 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:59.104 [2024-07-12 11:58:49.269386] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.104 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:59.362 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.362 "name": "raid_bdev1", 00:17:59.362 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:17:59.362 "strip_size_kb": 0, 00:17:59.362 "state": "online", 00:17:59.362 "raid_level": "raid1", 00:17:59.362 "superblock": true, 00:17:59.362 "num_base_bdevs": 4, 00:17:59.362 "num_base_bdevs_discovered": 3, 00:17:59.362 "num_base_bdevs_operational": 3, 00:17:59.362 "base_bdevs_list": [ 00:17:59.362 { 00:17:59.362 "name": null, 00:17:59.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.362 "is_configured": false, 00:17:59.362 "data_offset": 2048, 00:17:59.362 "data_size": 63488 00:17:59.362 }, 00:17:59.362 { 00:17:59.362 "name": "pt2", 00:17:59.362 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:59.362 "is_configured": true, 00:17:59.362 "data_offset": 2048, 00:17:59.362 "data_size": 63488 00:17:59.362 }, 00:17:59.362 { 00:17:59.362 "name": "pt3", 00:17:59.362 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:59.362 "is_configured": true, 00:17:59.362 "data_offset": 2048, 00:17:59.362 "data_size": 63488 00:17:59.362 }, 00:17:59.362 { 00:17:59.362 "name": "pt4", 00:17:59.362 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:59.362 "is_configured": true, 00:17:59.362 "data_offset": 2048, 00:17:59.362 "data_size": 63488 00:17:59.362 } 00:17:59.362 ] 00:17:59.362 }' 00:17:59.362 11:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.362 11:58:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.929 11:58:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:59.929 [2024-07-12 11:58:50.095494] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:59.929 [2024-07-12 11:58:50.095515] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:59.929 [2024-07-12 11:58:50.095557] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:59.929 [2024-07-12 11:58:50.095604] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:59.929 [2024-07-12 11:58:50.095610] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2361190 name raid_bdev1, state offline 00:17:59.929 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.929 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:18:00.187 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:18:00.187 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:18:00.187 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:18:00.187 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:00.187 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:00.446 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:00.446 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:00.446 11:58:50 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:00.446 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:00.446 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:00.446 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:00.705 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:00.705 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:00.705 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:18:00.705 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:00.705 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:00.964 [2024-07-12 11:58:50.957818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:00.964 [2024-07-12 11:58:50.957849] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:00.964 [2024-07-12 11:58:50.957860] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b3330 00:18:00.964 [2024-07-12 11:58:50.957867] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:00.964 [2024-07-12 11:58:50.959067] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:00.964 [2024-07-12 11:58:50.959087] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:00.964 [2024-07-12 11:58:50.959133] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:00.964 [2024-07-12 11:58:50.959152] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:00.964 pt2 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.964 11:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:00.964 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.964 "name": "raid_bdev1", 00:18:00.964 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:18:00.964 "strip_size_kb": 0, 00:18:00.964 "state": "configuring", 00:18:00.964 "raid_level": "raid1", 00:18:00.964 "superblock": true, 
00:18:00.964 "num_base_bdevs": 4, 00:18:00.964 "num_base_bdevs_discovered": 1, 00:18:00.964 "num_base_bdevs_operational": 3, 00:18:00.964 "base_bdevs_list": [ 00:18:00.964 { 00:18:00.964 "name": null, 00:18:00.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.964 "is_configured": false, 00:18:00.964 "data_offset": 2048, 00:18:00.964 "data_size": 63488 00:18:00.964 }, 00:18:00.964 { 00:18:00.964 "name": "pt2", 00:18:00.964 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:00.964 "is_configured": true, 00:18:00.964 "data_offset": 2048, 00:18:00.964 "data_size": 63488 00:18:00.964 }, 00:18:00.964 { 00:18:00.964 "name": null, 00:18:00.964 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:00.964 "is_configured": false, 00:18:00.964 "data_offset": 2048, 00:18:00.964 "data_size": 63488 00:18:00.964 }, 00:18:00.964 { 00:18:00.964 "name": null, 00:18:00.964 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:00.964 "is_configured": false, 00:18:00.964 "data_offset": 2048, 00:18:00.964 "data_size": 63488 00:18:00.964 } 00:18:00.964 ] 00:18:00.964 }' 00:18:00.964 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.964 11:58:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:01.531 [2024-07-12 11:58:51.739846] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:01.531 [2024-07-12 11:58:51.739877] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:01.531 [2024-07-12 
11:58:51.739888] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2363ef0 00:18:01.531 [2024-07-12 11:58:51.739894] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:01.531 [2024-07-12 11:58:51.740135] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:01.531 [2024-07-12 11:58:51.740145] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:01.531 [2024-07-12 11:58:51.740186] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:01.531 [2024-07-12 11:58:51.740198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:01.531 pt3 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.531 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:01.531 11:58:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.790 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.790 "name": "raid_bdev1", 00:18:01.790 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:18:01.790 "strip_size_kb": 0, 00:18:01.790 "state": "configuring", 00:18:01.790 "raid_level": "raid1", 00:18:01.790 "superblock": true, 00:18:01.790 "num_base_bdevs": 4, 00:18:01.790 "num_base_bdevs_discovered": 2, 00:18:01.790 "num_base_bdevs_operational": 3, 00:18:01.790 "base_bdevs_list": [ 00:18:01.790 { 00:18:01.790 "name": null, 00:18:01.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:01.790 "is_configured": false, 00:18:01.790 "data_offset": 2048, 00:18:01.790 "data_size": 63488 00:18:01.790 }, 00:18:01.790 { 00:18:01.790 "name": "pt2", 00:18:01.790 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:01.790 "is_configured": true, 00:18:01.790 "data_offset": 2048, 00:18:01.790 "data_size": 63488 00:18:01.790 }, 00:18:01.790 { 00:18:01.790 "name": "pt3", 00:18:01.790 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:01.790 "is_configured": true, 00:18:01.790 "data_offset": 2048, 00:18:01.790 "data_size": 63488 00:18:01.790 }, 00:18:01.790 { 00:18:01.790 "name": null, 00:18:01.790 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:01.790 "is_configured": false, 00:18:01.790 "data_offset": 2048, 00:18:01.790 "data_size": 63488 00:18:01.790 } 00:18:01.790 ] 00:18:01.790 }' 00:18:01.790 11:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.790 11:58:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 
00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:02.357 [2024-07-12 11:58:52.561970] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:02.357 [2024-07-12 11:58:52.562001] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:02.357 [2024-07-12 11:58:52.562011] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b2e20 00:18:02.357 [2024-07-12 11:58:52.562033] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:02.357 [2024-07-12 11:58:52.562267] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:02.357 [2024-07-12 11:58:52.562277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:02.357 [2024-07-12 11:58:52.562317] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:02.357 [2024-07-12 11:58:52.562328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:02.357 [2024-07-12 11:58:52.562406] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21b4640 00:18:02.357 [2024-07-12 11:58:52.562412] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:02.357 [2024-07-12 11:58:52.562528] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21b30b0 00:18:02.357 [2024-07-12 11:58:52.562618] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21b4640 00:18:02.357 [2024-07-12 11:58:52.562623] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21b4640 00:18:02.357 [2024-07-12 11:58:52.562687] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:02.357 pt4 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:02.357 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.616 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:02.616 "name": "raid_bdev1", 00:18:02.616 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:18:02.616 "strip_size_kb": 0, 00:18:02.616 "state": "online", 00:18:02.616 "raid_level": "raid1", 00:18:02.616 "superblock": true, 00:18:02.616 "num_base_bdevs": 4, 00:18:02.616 "num_base_bdevs_discovered": 3, 00:18:02.616 "num_base_bdevs_operational": 3, 00:18:02.616 "base_bdevs_list": 
[ 00:18:02.616 { 00:18:02.616 "name": null, 00:18:02.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.616 "is_configured": false, 00:18:02.616 "data_offset": 2048, 00:18:02.616 "data_size": 63488 00:18:02.616 }, 00:18:02.616 { 00:18:02.616 "name": "pt2", 00:18:02.616 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:02.616 "is_configured": true, 00:18:02.616 "data_offset": 2048, 00:18:02.616 "data_size": 63488 00:18:02.616 }, 00:18:02.616 { 00:18:02.616 "name": "pt3", 00:18:02.616 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:02.616 "is_configured": true, 00:18:02.616 "data_offset": 2048, 00:18:02.616 "data_size": 63488 00:18:02.616 }, 00:18:02.616 { 00:18:02.616 "name": "pt4", 00:18:02.617 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:02.617 "is_configured": true, 00:18:02.617 "data_offset": 2048, 00:18:02.617 "data_size": 63488 00:18:02.617 } 00:18:02.617 ] 00:18:02.617 }' 00:18:02.617 11:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:02.617 11:58:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:03.182 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:03.182 [2024-07-12 11:58:53.408144] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:03.182 [2024-07-12 11:58:53.408163] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:03.182 [2024-07-12 11:58:53.408201] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:03.182 [2024-07-12 11:58:53.408245] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:03.182 [2024-07-12 11:58:53.408251] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21b4640 name raid_bdev1, state offline 00:18:03.441 
11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.441 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:18:03.441 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:18:03.441 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:18:03.441 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:18:03.441 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:18:03.441 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:03.700 [2024-07-12 11:58:53.913432] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:03.700 [2024-07-12 11:58:53.913464] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:03.700 [2024-07-12 11:58:53.913473] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b4640 00:18:03.700 [2024-07-12 11:58:53.913479] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:03.700 [2024-07-12 11:58:53.914662] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:03.700 [2024-07-12 11:58:53.914682] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:03.700 [2024-07-12 11:58:53.914724] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:03.700 [2024-07-12 11:58:53.914754] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:03.700 [2024-07-12 11:58:53.914815] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:18:03.700 [2024-07-12 11:58:53.914821] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:03.700 [2024-07-12 11:58:53.914829] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2364b30 name raid_bdev1, state configuring 00:18:03.700 [2024-07-12 11:58:53.914843] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:03.700 [2024-07-12 11:58:53.914890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:03.700 pt1 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.700 11:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:03.960 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.960 "name": "raid_bdev1", 00:18:03.960 "uuid": "def2915b-5aff-4786-ae71-c1e85208df84", 00:18:03.960 "strip_size_kb": 0, 00:18:03.960 "state": "configuring", 00:18:03.960 "raid_level": "raid1", 00:18:03.960 "superblock": true, 00:18:03.960 "num_base_bdevs": 4, 00:18:03.960 "num_base_bdevs_discovered": 2, 00:18:03.960 "num_base_bdevs_operational": 3, 00:18:03.960 "base_bdevs_list": [ 00:18:03.960 { 00:18:03.960 "name": null, 00:18:03.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.960 "is_configured": false, 00:18:03.960 "data_offset": 2048, 00:18:03.960 "data_size": 63488 00:18:03.960 }, 00:18:03.960 { 00:18:03.960 "name": "pt2", 00:18:03.960 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:03.960 "is_configured": true, 00:18:03.960 "data_offset": 2048, 00:18:03.960 "data_size": 63488 00:18:03.960 }, 00:18:03.960 { 00:18:03.960 "name": "pt3", 00:18:03.960 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:03.960 "is_configured": true, 00:18:03.960 "data_offset": 2048, 00:18:03.960 "data_size": 63488 00:18:03.960 }, 00:18:03.960 { 00:18:03.960 "name": null, 00:18:03.960 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:03.960 "is_configured": false, 00:18:03.960 "data_offset": 2048, 00:18:03.960 "data_size": 63488 00:18:03.960 } 00:18:03.960 ] 00:18:03.960 }' 00:18:03.960 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.960 11:58:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.527 11:58:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:04.527 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:04.786 [2024-07-12 11:58:54.932084] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:04.786 [2024-07-12 11:58:54.932118] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:04.786 [2024-07-12 11:58:54.932128] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x235e8b0 00:18:04.786 [2024-07-12 11:58:54.932150] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:04.786 [2024-07-12 11:58:54.932392] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:04.786 [2024-07-12 11:58:54.932401] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:04.786 [2024-07-12 11:58:54.932441] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:04.786 [2024-07-12 11:58:54.932453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:04.786 [2024-07-12 11:58:54.932535] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2249e10 00:18:04.786 [2024-07-12 11:58:54.932541] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:04.786 [2024-07-12 11:58:54.932654] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2249a50 00:18:04.786 [2024-07-12 
11:58:54.932739] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2249e10 00:18:04.786 [2024-07-12 11:58:54.932744] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2249e10 00:18:04.786 [2024-07-12 11:58:54.932804] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:04.786 pt4 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.786 11:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:05.045 11:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.045 "name": "raid_bdev1", 00:18:05.045 "uuid": 
"def2915b-5aff-4786-ae71-c1e85208df84", 00:18:05.045 "strip_size_kb": 0, 00:18:05.045 "state": "online", 00:18:05.045 "raid_level": "raid1", 00:18:05.045 "superblock": true, 00:18:05.045 "num_base_bdevs": 4, 00:18:05.045 "num_base_bdevs_discovered": 3, 00:18:05.045 "num_base_bdevs_operational": 3, 00:18:05.045 "base_bdevs_list": [ 00:18:05.045 { 00:18:05.045 "name": null, 00:18:05.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.045 "is_configured": false, 00:18:05.045 "data_offset": 2048, 00:18:05.045 "data_size": 63488 00:18:05.045 }, 00:18:05.045 { 00:18:05.045 "name": "pt2", 00:18:05.045 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:05.045 "is_configured": true, 00:18:05.045 "data_offset": 2048, 00:18:05.045 "data_size": 63488 00:18:05.045 }, 00:18:05.045 { 00:18:05.045 "name": "pt3", 00:18:05.045 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:05.045 "is_configured": true, 00:18:05.045 "data_offset": 2048, 00:18:05.045 "data_size": 63488 00:18:05.045 }, 00:18:05.045 { 00:18:05.045 "name": "pt4", 00:18:05.045 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:05.045 "is_configured": true, 00:18:05.045 "data_offset": 2048, 00:18:05.045 "data_size": 63488 00:18:05.045 } 00:18:05.045 ] 00:18:05.045 }' 00:18:05.045 11:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.045 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:05.612 11:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:05.612 11:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:18:05.612 11:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:18:05.612 11:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:05.612 11:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:18:05.871 [2024-07-12 11:58:55.938857] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' def2915b-5aff-4786-ae71-c1e85208df84 '!=' def2915b-5aff-4786-ae71-c1e85208df84 ']' 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 686926 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 686926 ']' 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 686926 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 686926 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 686926' 00:18:05.871 killing process with pid 686926 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 686926 00:18:05.871 [2024-07-12 11:58:55.995523] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:05.871 [2024-07-12 11:58:55.995567] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:05.871 [2024-07-12 11:58:55.995611] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid 
bdev base bdevs is 0, going to free all in destruct 00:18:05.871 [2024-07-12 11:58:55.995616] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2249e10 name raid_bdev1, state offline 00:18:05.871 11:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 686926 00:18:05.871 [2024-07-12 11:58:56.027963] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:06.134 11:58:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:06.134 00:18:06.134 real 0m19.069s 00:18:06.134 user 0m35.421s 00:18:06.134 sys 0m2.928s 00:18:06.134 11:58:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:06.134 11:58:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.134 ************************************ 00:18:06.134 END TEST raid_superblock_test 00:18:06.134 ************************************ 00:18:06.134 11:58:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:06.134 11:58:56 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:18:06.134 11:58:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:06.134 11:58:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:06.134 11:58:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:06.134 ************************************ 00:18:06.134 START TEST raid_read_error_test 00:18:06.134 ************************************ 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:06.134 11:58:56 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:06.134 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.fr9OpT2UWs 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=690548 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 690548 /var/tmp/spdk-raid.sock 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 690548 ']' 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:06.135 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:06.135 11:58:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.135 [2024-07-12 11:58:56.333593] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:18:06.135 [2024-07-12 11:58:56.333632] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid690548 ] 00:18:06.437 [2024-07-12 11:58:56.400447] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:06.437 [2024-07-12 11:58:56.479426] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:06.437 [2024-07-12 11:58:56.536450] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:06.437 [2024-07-12 11:58:56.536477] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:07.013 11:58:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:07.013 11:58:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:07.013 11:58:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:07.013 11:58:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:07.272 BaseBdev1_malloc 00:18:07.272 11:58:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:07.272 true 00:18:07.272 11:58:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:07.531 [2024-07-12 11:58:57.604563] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:07.531 [2024-07-12 11:58:57.604592] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:07.531 [2024-07-12 11:58:57.604603] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb72d0 00:18:07.531 [2024-07-12 11:58:57.604609] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:07.531 [2024-07-12 11:58:57.605822] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:07.531 [2024-07-12 11:58:57.605843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:07.531 BaseBdev1 00:18:07.531 11:58:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:07.531 11:58:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:07.531 BaseBdev2_malloc 00:18:07.790 11:58:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:07.790 true 00:18:07.790 11:58:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:08.048 [2024-07-12 11:58:58.097614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:08.048 [2024-07-12 11:58:58.097643] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.048 [2024-07-12 11:58:58.097653] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x1ebbf40 00:18:08.048 [2024-07-12 11:58:58.097659] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.048 [2024-07-12 11:58:58.098730] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.048 [2024-07-12 11:58:58.098750] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:08.048 BaseBdev2 00:18:08.048 11:58:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:08.048 11:58:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:08.048 BaseBdev3_malloc 00:18:08.048 11:58:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:08.304 true 00:18:08.304 11:58:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:08.562 [2024-07-12 11:58:58.578250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:08.562 [2024-07-12 11:58:58.578277] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.562 [2024-07-12 11:58:58.578287] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ebeea0 00:18:08.562 [2024-07-12 11:58:58.578294] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.562 [2024-07-12 11:58:58.579366] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.562 [2024-07-12 11:58:58.579386] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:08.562 
BaseBdev3 00:18:08.562 11:58:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:08.562 11:58:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:08.562 BaseBdev4_malloc 00:18:08.562 11:58:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:08.820 true 00:18:08.820 11:58:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:09.079 [2024-07-12 11:58:59.074954] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:09.079 [2024-07-12 11:58:59.074983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:09.079 [2024-07-12 11:58:59.074993] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb82f0 00:18:09.079 [2024-07-12 11:58:59.074999] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:09.079 [2024-07-12 11:58:59.075949] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:09.079 [2024-07-12 11:58:59.075967] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:09.079 BaseBdev4 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:09.079 [2024-07-12 11:58:59.255440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:09.079 [2024-07-12 
11:58:59.256352] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:09.079 [2024-07-12 11:58:59.256398] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:09.079 [2024-07-12 11:58:59.256437] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:09.079 [2024-07-12 11:58:59.256597] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1eb8d30 00:18:09.079 [2024-07-12 11:58:59.256604] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:09.079 [2024-07-12 11:58:59.256744] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da1720 00:18:09.079 [2024-07-12 11:58:59.256846] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1eb8d30 00:18:09.079 [2024-07-12 11:58:59.256851] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1eb8d30 00:18:09.079 [2024-07-12 11:58:59.256917] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.079 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:09.338 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.338 "name": "raid_bdev1", 00:18:09.338 "uuid": "ec658ccf-031a-4652-b692-498f13b58266", 00:18:09.338 "strip_size_kb": 0, 00:18:09.338 "state": "online", 00:18:09.338 "raid_level": "raid1", 00:18:09.338 "superblock": true, 00:18:09.338 "num_base_bdevs": 4, 00:18:09.338 "num_base_bdevs_discovered": 4, 00:18:09.338 "num_base_bdevs_operational": 4, 00:18:09.338 "base_bdevs_list": [ 00:18:09.338 { 00:18:09.338 "name": "BaseBdev1", 00:18:09.338 "uuid": "7ffd3a92-95d8-52eb-86af-9d480c112d13", 00:18:09.338 "is_configured": true, 00:18:09.338 "data_offset": 2048, 00:18:09.338 "data_size": 63488 00:18:09.338 }, 00:18:09.338 { 00:18:09.338 "name": "BaseBdev2", 00:18:09.338 "uuid": "97670557-057d-5c59-9a3c-3dd9041857a4", 00:18:09.338 "is_configured": true, 00:18:09.338 "data_offset": 2048, 00:18:09.338 "data_size": 63488 00:18:09.338 }, 00:18:09.338 { 00:18:09.338 "name": "BaseBdev3", 00:18:09.338 "uuid": "53522bbe-ebc4-5317-97ee-f36428ddd5ca", 00:18:09.338 "is_configured": true, 00:18:09.338 "data_offset": 2048, 00:18:09.338 "data_size": 63488 00:18:09.338 }, 00:18:09.338 { 00:18:09.338 "name": "BaseBdev4", 00:18:09.338 "uuid": "9e68ae7e-b432-5718-8001-c4be6a0d2f58", 00:18:09.338 "is_configured": true, 00:18:09.338 "data_offset": 2048, 00:18:09.338 "data_size": 63488 00:18:09.338 } 00:18:09.338 ] 00:18:09.338 }' 00:18:09.338 11:58:59 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.338 11:58:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.906 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:09.906 11:58:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:09.906 [2024-07-12 11:58:59.989527] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da5220 00:18:10.843 11:59:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.843 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.102 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.102 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:11.102 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.102 "name": "raid_bdev1", 00:18:11.102 "uuid": "ec658ccf-031a-4652-b692-498f13b58266", 00:18:11.102 "strip_size_kb": 0, 00:18:11.102 "state": "online", 00:18:11.102 "raid_level": "raid1", 00:18:11.102 "superblock": true, 00:18:11.102 "num_base_bdevs": 4, 00:18:11.102 "num_base_bdevs_discovered": 4, 00:18:11.102 "num_base_bdevs_operational": 4, 00:18:11.102 "base_bdevs_list": [ 00:18:11.102 { 00:18:11.102 "name": "BaseBdev1", 00:18:11.102 "uuid": "7ffd3a92-95d8-52eb-86af-9d480c112d13", 00:18:11.102 "is_configured": true, 00:18:11.102 "data_offset": 2048, 00:18:11.102 "data_size": 63488 00:18:11.102 }, 00:18:11.102 { 00:18:11.102 "name": "BaseBdev2", 00:18:11.102 "uuid": "97670557-057d-5c59-9a3c-3dd9041857a4", 00:18:11.102 "is_configured": true, 00:18:11.102 "data_offset": 2048, 00:18:11.102 "data_size": 63488 00:18:11.102 }, 00:18:11.102 { 00:18:11.102 "name": "BaseBdev3", 00:18:11.102 "uuid": "53522bbe-ebc4-5317-97ee-f36428ddd5ca", 00:18:11.102 "is_configured": true, 00:18:11.102 "data_offset": 2048, 00:18:11.102 "data_size": 63488 00:18:11.102 }, 00:18:11.102 { 00:18:11.102 "name": "BaseBdev4", 00:18:11.102 "uuid": "9e68ae7e-b432-5718-8001-c4be6a0d2f58", 00:18:11.102 "is_configured": 
true, 00:18:11.102 "data_offset": 2048, 00:18:11.102 "data_size": 63488 00:18:11.102 } 00:18:11.102 ] 00:18:11.102 }' 00:18:11.102 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.102 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.668 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:11.668 [2024-07-12 11:59:01.908678] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:11.668 [2024-07-12 11:59:01.908706] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:11.668 [2024-07-12 11:59:01.910882] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:11.668 [2024-07-12 11:59:01.910908] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:11.668 [2024-07-12 11:59:01.910988] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:11.668 [2024-07-12 11:59:01.911000] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eb8d30 name raid_bdev1, state offline 00:18:11.668 0 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 690548 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 690548 ']' 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 690548 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 690548 00:18:11.927 11:59:01 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 690548' 00:18:11.927 killing process with pid 690548 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 690548 00:18:11.927 [2024-07-12 11:59:01.969441] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:11.927 11:59:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 690548 00:18:11.927 [2024-07-12 11:59:01.996008] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.fr9OpT2UWs 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:12.187 00:18:12.187 real 0m5.915s 00:18:12.187 user 0m9.311s 00:18:12.187 sys 0m0.841s 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:12.187 11:59:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.187 ************************************ 00:18:12.187 END TEST 
raid_read_error_test 00:18:12.187 ************************************ 00:18:12.187 11:59:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:12.187 11:59:02 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:18:12.187 11:59:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:12.187 11:59:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:12.187 11:59:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:12.187 ************************************ 00:18:12.187 START TEST raid_write_error_test 00:18:12.187 ************************************ 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.74ORfJ9Ypu 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=691747 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 691747 /var/tmp/spdk-raid.sock 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 691747 ']' 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:12.187 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:12.187 11:59:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.187 [2024-07-12 11:59:02.320046] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:18:12.187 [2024-07-12 11:59:02.320083] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid691747 ] 00:18:12.187 [2024-07-12 11:59:02.384612] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:12.446 [2024-07-12 11:59:02.461212] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:12.447 [2024-07-12 11:59:02.509112] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:12.447 [2024-07-12 11:59:02.509133] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:13.012 11:59:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:13.012 11:59:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:13.012 11:59:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:13.012 11:59:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:13.270 BaseBdev1_malloc 00:18:13.270 11:59:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:13.270 true 00:18:13.270 11:59:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:13.528 [2024-07-12 11:59:03.600959] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:13.528 [2024-07-12 11:59:03.600993] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:13.528 
[2024-07-12 11:59:03.601005] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27192d0 00:18:13.528 [2024-07-12 11:59:03.601012] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:13.528 [2024-07-12 11:59:03.602357] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:13.528 [2024-07-12 11:59:03.602379] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:13.528 BaseBdev1 00:18:13.528 11:59:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:13.528 11:59:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:13.528 BaseBdev2_malloc 00:18:13.786 11:59:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:13.786 true 00:18:13.786 11:59:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:14.044 [2024-07-12 11:59:04.113725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:14.044 [2024-07-12 11:59:04.113753] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:14.044 [2024-07-12 11:59:04.113762] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x271df40 00:18:14.044 [2024-07-12 11:59:04.113768] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:14.044 [2024-07-12 11:59:04.114714] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:14.044 [2024-07-12 11:59:04.114733] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:14.044 BaseBdev2 00:18:14.044 11:59:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:14.044 11:59:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:14.302 BaseBdev3_malloc 00:18:14.302 11:59:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:14.302 true 00:18:14.302 11:59:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:14.561 [2024-07-12 11:59:04.618489] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:14.561 [2024-07-12 11:59:04.618519] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:14.561 [2024-07-12 11:59:04.618528] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2720ea0 00:18:14.561 [2024-07-12 11:59:04.618534] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:14.561 [2024-07-12 11:59:04.619497] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:14.561 [2024-07-12 11:59:04.619516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:14.561 BaseBdev3 00:18:14.561 11:59:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:14.561 11:59:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:18:14.561 BaseBdev4_malloc 00:18:14.819 11:59:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:14.819 true 00:18:14.819 11:59:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:15.078 [2024-07-12 11:59:05.143410] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:15.078 [2024-07-12 11:59:05.143439] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:15.078 [2024-07-12 11:59:05.143447] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x271a2f0 00:18:15.078 [2024-07-12 11:59:05.143453] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:15.078 [2024-07-12 11:59:05.144391] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:15.078 [2024-07-12 11:59:05.144411] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:15.078 BaseBdev4 00:18:15.078 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:15.078 [2024-07-12 11:59:05.311864] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:15.078 [2024-07-12 11:59:05.312625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:15.078 [2024-07-12 11:59:05.312667] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:15.078 [2024-07-12 11:59:05.312704] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:18:15.078 [2024-07-12 11:59:05.312847] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x271ad30 00:18:15.078 [2024-07-12 11:59:05.312853] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:15.078 [2024-07-12 11:59:05.312965] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2603720 00:18:15.078 [2024-07-12 11:59:05.313062] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x271ad30 00:18:15.078 [2024-07-12 11:59:05.313067] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x271ad30 00:18:15.078 [2024-07-12 11:59:05.313129] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:15.336 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.336 "name": "raid_bdev1", 00:18:15.336 "uuid": "8a91af66-ff7f-4509-846e-e484f685fb42", 00:18:15.336 "strip_size_kb": 0, 00:18:15.336 "state": "online", 00:18:15.336 "raid_level": "raid1", 00:18:15.336 "superblock": true, 00:18:15.336 "num_base_bdevs": 4, 00:18:15.336 "num_base_bdevs_discovered": 4, 00:18:15.337 "num_base_bdevs_operational": 4, 00:18:15.337 "base_bdevs_list": [ 00:18:15.337 { 00:18:15.337 "name": "BaseBdev1", 00:18:15.337 "uuid": "9eca9318-face-552c-86cd-807537416e75", 00:18:15.337 "is_configured": true, 00:18:15.337 "data_offset": 2048, 00:18:15.337 "data_size": 63488 00:18:15.337 }, 00:18:15.337 { 00:18:15.337 "name": "BaseBdev2", 00:18:15.337 "uuid": "1caf32b0-bb68-575c-b70d-cf19ffc75910", 00:18:15.337 "is_configured": true, 00:18:15.337 "data_offset": 2048, 00:18:15.337 "data_size": 63488 00:18:15.337 }, 00:18:15.337 { 00:18:15.337 "name": "BaseBdev3", 00:18:15.337 "uuid": "0f04d8c4-631e-5de0-a36f-3b4b1e1de28c", 00:18:15.337 "is_configured": true, 00:18:15.337 "data_offset": 2048, 00:18:15.337 "data_size": 63488 00:18:15.337 }, 00:18:15.337 { 00:18:15.337 "name": "BaseBdev4", 00:18:15.337 "uuid": "fe32b1ab-5bbe-5762-ab9f-b345317f0d18", 00:18:15.337 "is_configured": true, 00:18:15.337 "data_offset": 2048, 00:18:15.337 "data_size": 63488 00:18:15.337 } 00:18:15.337 ] 00:18:15.337 }' 00:18:15.337 11:59:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.337 11:59:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:15.904 11:59:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:15.904 11:59:06 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:15.904 [2024-07-12 11:59:06.082051] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2607220 00:18:16.839 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:17.097 [2024-07-12 11:59:07.165675] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:18:17.097 [2024-07-12 11:59:07.165715] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:17.097 [2024-07-12 11:59:07.165897] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2607220 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:17.097 11:59:07 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.097 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:17.355 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:17.355 "name": "raid_bdev1", 00:18:17.355 "uuid": "8a91af66-ff7f-4509-846e-e484f685fb42", 00:18:17.355 "strip_size_kb": 0, 00:18:17.355 "state": "online", 00:18:17.355 "raid_level": "raid1", 00:18:17.355 "superblock": true, 00:18:17.355 "num_base_bdevs": 4, 00:18:17.355 "num_base_bdevs_discovered": 3, 00:18:17.355 "num_base_bdevs_operational": 3, 00:18:17.355 "base_bdevs_list": [ 00:18:17.355 { 00:18:17.355 "name": null, 00:18:17.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:17.355 "is_configured": false, 00:18:17.355 "data_offset": 2048, 00:18:17.355 "data_size": 63488 00:18:17.355 }, 00:18:17.355 { 00:18:17.355 "name": "BaseBdev2", 00:18:17.355 "uuid": "1caf32b0-bb68-575c-b70d-cf19ffc75910", 00:18:17.355 "is_configured": true, 00:18:17.355 "data_offset": 2048, 00:18:17.355 "data_size": 63488 00:18:17.355 }, 00:18:17.355 { 00:18:17.355 "name": "BaseBdev3", 00:18:17.355 "uuid": "0f04d8c4-631e-5de0-a36f-3b4b1e1de28c", 00:18:17.355 "is_configured": true, 00:18:17.355 "data_offset": 2048, 00:18:17.355 "data_size": 63488 
00:18:17.355 }, 00:18:17.355 { 00:18:17.355 "name": "BaseBdev4", 00:18:17.355 "uuid": "fe32b1ab-5bbe-5762-ab9f-b345317f0d18", 00:18:17.355 "is_configured": true, 00:18:17.355 "data_offset": 2048, 00:18:17.355 "data_size": 63488 00:18:17.355 } 00:18:17.355 ] 00:18:17.355 }' 00:18:17.355 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:17.355 11:59:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:17.613 11:59:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:17.872 [2024-07-12 11:59:07.999668] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:17.872 [2024-07-12 11:59:07.999704] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:17.872 [2024-07-12 11:59:08.001777] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:17.872 [2024-07-12 11:59:08.001800] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:17.872 [2024-07-12 11:59:08.001859] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:17.872 [2024-07-12 11:59:08.001865] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x271ad30 name raid_bdev1, state offline 00:18:17.872 0 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 691747 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 691747 ']' 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 691747 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 
00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 691747 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 691747' 00:18:17.872 killing process with pid 691747 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 691747 00:18:17.872 [2024-07-12 11:59:08.060368] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:17.872 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 691747 00:18:17.872 [2024-07-12 11:59:08.086508] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:18.132 11:59:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.74ORfJ9Ypu 00:18:18.132 11:59:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:18.132 11:59:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:18.132 11:59:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:18.132 11:59:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:18.132 11:59:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:18.132 11:59:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:18.132 11:59:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:18.132 00:18:18.132 real 0m6.018s 00:18:18.132 user 0m9.495s 00:18:18.132 sys 0m0.884s 00:18:18.132 11:59:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:18.132 11:59:08 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.132 ************************************ 00:18:18.132 END TEST raid_write_error_test 00:18:18.132 ************************************ 00:18:18.132 11:59:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:18.132 11:59:08 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:18:18.132 11:59:08 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:18:18.132 11:59:08 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:18:18.132 11:59:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:18:18.132 11:59:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:18.132 11:59:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:18.132 ************************************ 00:18:18.132 START TEST raid_rebuild_test 00:18:18.132 ************************************ 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # 
(( i++ )) 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=692793 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 692793 /var/tmp/spdk-raid.sock 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 692793 ']' 
00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:18.132 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:18.132 11:59:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.391 [2024-07-12 11:59:08.398878] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:18:18.391 [2024-07-12 11:59:08.398919] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid692793 ] 00:18:18.391 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:18.391 Zero copy mechanism will not be used. 
00:18:18.391 [2024-07-12 11:59:08.464076] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:18.391 [2024-07-12 11:59:08.541991] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:18.391 [2024-07-12 11:59:08.590784] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:18.391 [2024-07-12 11:59:08.590808] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:18.957 11:59:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:18.957 11:59:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:18:18.957 11:59:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:18.957 11:59:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:19.215 BaseBdev1_malloc 00:18:19.215 11:59:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:19.473 [2024-07-12 11:59:09.517798] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:19.473 [2024-07-12 11:59:09.517828] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.473 [2024-07-12 11:59:09.517839] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde8010 00:18:19.473 [2024-07-12 11:59:09.517845] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.473 [2024-07-12 11:59:09.519019] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.473 [2024-07-12 11:59:09.519040] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:19.473 BaseBdev1 00:18:19.473 11:59:09 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:19.473 11:59:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:19.473 BaseBdev2_malloc 00:18:19.473 11:59:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:19.732 [2024-07-12 11:59:09.858297] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:19.732 [2024-07-12 11:59:09.858328] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.732 [2024-07-12 11:59:09.858341] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde8b60 00:18:19.732 [2024-07-12 11:59:09.858348] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.732 [2024-07-12 11:59:09.859438] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.732 [2024-07-12 11:59:09.859458] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:19.732 BaseBdev2 00:18:19.732 11:59:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:19.990 spare_malloc 00:18:19.990 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:19.991 spare_delay 00:18:19.991 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:18:20.250 [2024-07-12 11:59:10.367166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:20.250 [2024-07-12 11:59:10.367199] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:20.250 [2024-07-12 11:59:10.367210] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf93880 00:18:20.250 [2024-07-12 11:59:10.367217] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:20.250 [2024-07-12 11:59:10.368485] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:20.250 [2024-07-12 11:59:10.368507] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:20.250 spare 00:18:20.250 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:20.508 [2024-07-12 11:59:10.523585] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:20.508 [2024-07-12 11:59:10.524411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:20.508 [2024-07-12 11:59:10.524463] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf976c0 00:18:20.508 [2024-07-12 11:59:10.524470] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:20.508 [2024-07-12 11:59:10.524616] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf93b10 00:18:20.508 [2024-07-12 11:59:10.524712] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf976c0 00:18:20.508 [2024-07-12 11:59:10.524718] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf976c0 00:18:20.508 [2024-07-12 11:59:10.524790] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.508 "name": "raid_bdev1", 00:18:20.508 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:20.508 "strip_size_kb": 0, 00:18:20.508 "state": "online", 00:18:20.508 "raid_level": "raid1", 00:18:20.508 "superblock": false, 00:18:20.508 "num_base_bdevs": 2, 00:18:20.508 "num_base_bdevs_discovered": 2, 00:18:20.508 "num_base_bdevs_operational": 2, 00:18:20.508 "base_bdevs_list": [ 00:18:20.508 { 00:18:20.508 "name": "BaseBdev1", 00:18:20.508 "uuid": 
"c12090b4-453d-51cb-a280-9ce94f38ef55", 00:18:20.508 "is_configured": true, 00:18:20.508 "data_offset": 0, 00:18:20.508 "data_size": 65536 00:18:20.508 }, 00:18:20.508 { 00:18:20.508 "name": "BaseBdev2", 00:18:20.508 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:20.508 "is_configured": true, 00:18:20.508 "data_offset": 0, 00:18:20.508 "data_size": 65536 00:18:20.508 } 00:18:20.508 ] 00:18:20.508 }' 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.508 11:59:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:21.075 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:21.075 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:21.075 [2024-07-12 11:59:11.313778] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:18:21.334 11:59:11 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:21.334 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:18:21.592 [2024-07-12 11:59:11.638495] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf93b10 00:18:21.592 /dev/nbd0 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:21.592 1+0 records in 00:18:21.592 1+0 records out 00:18:21.592 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000178395 s, 23.0 MB/s 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:21.592 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:18:21.593 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:21.593 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:21.593 11:59:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:18:21.593 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:21.593 11:59:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:21.593 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:18:21.593 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:18:21.593 11:59:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:18:24.874 65536+0 records in 00:18:24.875 65536+0 records out 00:18:24.875 33554432 bytes (34 MB, 32 MiB) copied, 3.1277 s, 10.7 MB/s 00:18:24.875 11:59:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:18:24.875 11:59:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:18:24.875 11:59:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:18:24.875 11:59:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:24.875 11:59:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:18:24.875 11:59:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:24.875 11:59:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:24.875 [2024-07-12 11:59:15.008909] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:24.875 11:59:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:24.875 11:59:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:24.875 11:59:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:24.875 11:59:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:24.875 11:59:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:24.875 11:59:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:24.875 11:59:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:24.875 11:59:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:24.875 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:18:25.134 [2024-07-12 11:59:15.173346] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:25.134 11:59:15 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.134 "name": "raid_bdev1", 00:18:25.134 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:25.134 "strip_size_kb": 0, 00:18:25.134 "state": "online", 00:18:25.134 "raid_level": "raid1", 00:18:25.134 "superblock": false, 00:18:25.134 "num_base_bdevs": 2, 00:18:25.134 "num_base_bdevs_discovered": 1, 00:18:25.134 "num_base_bdevs_operational": 1, 00:18:25.134 "base_bdevs_list": [ 00:18:25.134 { 00:18:25.134 "name": null, 00:18:25.134 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.134 "is_configured": false, 00:18:25.134 "data_offset": 0, 00:18:25.134 "data_size": 65536 00:18:25.134 }, 00:18:25.134 { 00:18:25.134 "name": "BaseBdev2", 
00:18:25.134 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:25.134 "is_configured": true, 00:18:25.134 "data_offset": 0, 00:18:25.134 "data_size": 65536 00:18:25.134 } 00:18:25.134 ] 00:18:25.134 }' 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.134 11:59:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.702 11:59:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:25.960 [2024-07-12 11:59:16.019546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:25.960 [2024-07-12 11:59:16.023977] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf96ea0 00:18:25.960 [2024-07-12 11:59:16.025525] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:25.960 11:59:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:18:26.897 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:26.897 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:26.897 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:26.897 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:26.897 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:26.897 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.897 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:27.156 11:59:17 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:27.156 "name": "raid_bdev1", 00:18:27.156 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:27.156 "strip_size_kb": 0, 00:18:27.156 "state": "online", 00:18:27.156 "raid_level": "raid1", 00:18:27.156 "superblock": false, 00:18:27.156 "num_base_bdevs": 2, 00:18:27.156 "num_base_bdevs_discovered": 2, 00:18:27.156 "num_base_bdevs_operational": 2, 00:18:27.156 "process": { 00:18:27.156 "type": "rebuild", 00:18:27.156 "target": "spare", 00:18:27.156 "progress": { 00:18:27.156 "blocks": 22528, 00:18:27.156 "percent": 34 00:18:27.156 } 00:18:27.156 }, 00:18:27.156 "base_bdevs_list": [ 00:18:27.156 { 00:18:27.156 "name": "spare", 00:18:27.156 "uuid": "1dc3fe1d-6f00-59d1-a889-57ddf1f177c0", 00:18:27.156 "is_configured": true, 00:18:27.156 "data_offset": 0, 00:18:27.156 "data_size": 65536 00:18:27.156 }, 00:18:27.156 { 00:18:27.156 "name": "BaseBdev2", 00:18:27.156 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:27.156 "is_configured": true, 00:18:27.156 "data_offset": 0, 00:18:27.156 "data_size": 65536 00:18:27.156 } 00:18:27.156 ] 00:18:27.156 }' 00:18:27.156 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:27.156 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:27.156 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:27.156 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:27.156 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:27.415 [2024-07-12 11:59:17.440067] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:27.415 [2024-07-12 11:59:17.535973] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:18:27.415 [2024-07-12 11:59:17.536003] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:27.415 [2024-07-12 11:59:17.536012] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:27.415 [2024-07-12 11:59:17.536016] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.415 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:27.674 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.674 "name": "raid_bdev1", 00:18:27.674 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:27.674 
"strip_size_kb": 0, 00:18:27.674 "state": "online", 00:18:27.674 "raid_level": "raid1", 00:18:27.674 "superblock": false, 00:18:27.674 "num_base_bdevs": 2, 00:18:27.674 "num_base_bdevs_discovered": 1, 00:18:27.674 "num_base_bdevs_operational": 1, 00:18:27.674 "base_bdevs_list": [ 00:18:27.674 { 00:18:27.674 "name": null, 00:18:27.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.674 "is_configured": false, 00:18:27.674 "data_offset": 0, 00:18:27.674 "data_size": 65536 00:18:27.674 }, 00:18:27.674 { 00:18:27.674 "name": "BaseBdev2", 00:18:27.674 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:27.674 "is_configured": true, 00:18:27.674 "data_offset": 0, 00:18:27.674 "data_size": 65536 00:18:27.674 } 00:18:27.674 ] 00:18:27.674 }' 00:18:27.674 11:59:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.674 11:59:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:28.241 "name": "raid_bdev1", 00:18:28.241 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 
00:18:28.241 "strip_size_kb": 0, 00:18:28.241 "state": "online", 00:18:28.241 "raid_level": "raid1", 00:18:28.241 "superblock": false, 00:18:28.241 "num_base_bdevs": 2, 00:18:28.241 "num_base_bdevs_discovered": 1, 00:18:28.241 "num_base_bdevs_operational": 1, 00:18:28.241 "base_bdevs_list": [ 00:18:28.241 { 00:18:28.241 "name": null, 00:18:28.241 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:28.241 "is_configured": false, 00:18:28.241 "data_offset": 0, 00:18:28.241 "data_size": 65536 00:18:28.241 }, 00:18:28.241 { 00:18:28.241 "name": "BaseBdev2", 00:18:28.241 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:28.241 "is_configured": true, 00:18:28.241 "data_offset": 0, 00:18:28.241 "data_size": 65536 00:18:28.241 } 00:18:28.241 ] 00:18:28.241 }' 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:28.241 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:28.500 [2024-07-12 11:59:18.598718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:28.500 [2024-07-12 11:59:18.602975] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf96590 00:18:28.500 [2024-07-12 11:59:18.604035] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:28.500 11:59:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:18:29.437 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:18:29.437 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:29.437 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:29.437 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:29.437 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:29.437 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.437 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:29.695 "name": "raid_bdev1", 00:18:29.695 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:29.695 "strip_size_kb": 0, 00:18:29.695 "state": "online", 00:18:29.695 "raid_level": "raid1", 00:18:29.695 "superblock": false, 00:18:29.695 "num_base_bdevs": 2, 00:18:29.695 "num_base_bdevs_discovered": 2, 00:18:29.695 "num_base_bdevs_operational": 2, 00:18:29.695 "process": { 00:18:29.695 "type": "rebuild", 00:18:29.695 "target": "spare", 00:18:29.695 "progress": { 00:18:29.695 "blocks": 22528, 00:18:29.695 "percent": 34 00:18:29.695 } 00:18:29.695 }, 00:18:29.695 "base_bdevs_list": [ 00:18:29.695 { 00:18:29.695 "name": "spare", 00:18:29.695 "uuid": "1dc3fe1d-6f00-59d1-a889-57ddf1f177c0", 00:18:29.695 "is_configured": true, 00:18:29.695 "data_offset": 0, 00:18:29.695 "data_size": 65536 00:18:29.695 }, 00:18:29.695 { 00:18:29.695 "name": "BaseBdev2", 00:18:29.695 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:29.695 "is_configured": true, 00:18:29.695 "data_offset": 0, 00:18:29.695 "data_size": 65536 00:18:29.695 } 00:18:29.695 ] 00:18:29.695 }' 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # 
jq -r '.process.type // "none"' 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=579 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.695 11:59:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.953 11:59:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:29.953 "name": "raid_bdev1", 00:18:29.953 
"uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:29.953 "strip_size_kb": 0, 00:18:29.953 "state": "online", 00:18:29.953 "raid_level": "raid1", 00:18:29.953 "superblock": false, 00:18:29.953 "num_base_bdevs": 2, 00:18:29.953 "num_base_bdevs_discovered": 2, 00:18:29.953 "num_base_bdevs_operational": 2, 00:18:29.953 "process": { 00:18:29.953 "type": "rebuild", 00:18:29.953 "target": "spare", 00:18:29.953 "progress": { 00:18:29.953 "blocks": 28672, 00:18:29.953 "percent": 43 00:18:29.953 } 00:18:29.953 }, 00:18:29.953 "base_bdevs_list": [ 00:18:29.953 { 00:18:29.953 "name": "spare", 00:18:29.953 "uuid": "1dc3fe1d-6f00-59d1-a889-57ddf1f177c0", 00:18:29.953 "is_configured": true, 00:18:29.953 "data_offset": 0, 00:18:29.953 "data_size": 65536 00:18:29.954 }, 00:18:29.954 { 00:18:29.954 "name": "BaseBdev2", 00:18:29.954 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:29.954 "is_configured": true, 00:18:29.954 "data_offset": 0, 00:18:29.954 "data_size": 65536 00:18:29.954 } 00:18:29.954 ] 00:18:29.954 }' 00:18:29.954 11:59:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:29.954 11:59:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:29.954 11:59:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:29.954 11:59:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:29.954 11:59:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:31.331 "name": "raid_bdev1", 00:18:31.331 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:31.331 "strip_size_kb": 0, 00:18:31.331 "state": "online", 00:18:31.331 "raid_level": "raid1", 00:18:31.331 "superblock": false, 00:18:31.331 "num_base_bdevs": 2, 00:18:31.331 "num_base_bdevs_discovered": 2, 00:18:31.331 "num_base_bdevs_operational": 2, 00:18:31.331 "process": { 00:18:31.331 "type": "rebuild", 00:18:31.331 "target": "spare", 00:18:31.331 "progress": { 00:18:31.331 "blocks": 53248, 00:18:31.331 "percent": 81 00:18:31.331 } 00:18:31.331 }, 00:18:31.331 "base_bdevs_list": [ 00:18:31.331 { 00:18:31.331 "name": "spare", 00:18:31.331 "uuid": "1dc3fe1d-6f00-59d1-a889-57ddf1f177c0", 00:18:31.331 "is_configured": true, 00:18:31.331 "data_offset": 0, 00:18:31.331 "data_size": 65536 00:18:31.331 }, 00:18:31.331 { 00:18:31.331 "name": "BaseBdev2", 00:18:31.331 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:31.331 "is_configured": true, 00:18:31.331 "data_offset": 0, 00:18:31.331 "data_size": 65536 00:18:31.331 } 00:18:31.331 ] 00:18:31.331 }' 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:31.331 11:59:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:31.589 [2024-07-12 11:59:21.826225] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:31.589 [2024-07-12 11:59:21.826265] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:31.589 [2024-07-12 11:59:21.826289] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:32.215 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:32.215 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:32.215 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:32.215 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:32.215 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:32.215 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:32.215 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.215 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:32.502 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:32.502 "name": "raid_bdev1", 00:18:32.502 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:32.502 "strip_size_kb": 0, 00:18:32.502 "state": "online", 00:18:32.502 "raid_level": "raid1", 00:18:32.502 "superblock": false, 00:18:32.502 "num_base_bdevs": 2, 00:18:32.502 
"num_base_bdevs_discovered": 2, 00:18:32.502 "num_base_bdevs_operational": 2, 00:18:32.502 "base_bdevs_list": [ 00:18:32.502 { 00:18:32.502 "name": "spare", 00:18:32.502 "uuid": "1dc3fe1d-6f00-59d1-a889-57ddf1f177c0", 00:18:32.502 "is_configured": true, 00:18:32.502 "data_offset": 0, 00:18:32.502 "data_size": 65536 00:18:32.502 }, 00:18:32.502 { 00:18:32.502 "name": "BaseBdev2", 00:18:32.502 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:32.502 "is_configured": true, 00:18:32.502 "data_offset": 0, 00:18:32.502 "data_size": 65536 00:18:32.502 } 00:18:32.502 ] 00:18:32.502 }' 00:18:32.502 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.503 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:32.761 11:59:22 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:32.761 "name": "raid_bdev1", 00:18:32.761 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:32.761 "strip_size_kb": 0, 00:18:32.761 "state": "online", 00:18:32.761 "raid_level": "raid1", 00:18:32.761 "superblock": false, 00:18:32.761 "num_base_bdevs": 2, 00:18:32.761 "num_base_bdevs_discovered": 2, 00:18:32.761 "num_base_bdevs_operational": 2, 00:18:32.761 "base_bdevs_list": [ 00:18:32.762 { 00:18:32.762 "name": "spare", 00:18:32.762 "uuid": "1dc3fe1d-6f00-59d1-a889-57ddf1f177c0", 00:18:32.762 "is_configured": true, 00:18:32.762 "data_offset": 0, 00:18:32.762 "data_size": 65536 00:18:32.762 }, 00:18:32.762 { 00:18:32.762 "name": "BaseBdev2", 00:18:32.762 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:32.762 "is_configured": true, 00:18:32.762 "data_offset": 0, 00:18:32.762 "data_size": 65536 00:18:32.762 } 00:18:32.762 ] 00:18:32.762 }' 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.762 11:59:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:33.020 11:59:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.020 "name": "raid_bdev1", 00:18:33.020 "uuid": "abda5314-75f5-4a4a-949d-f31d02330a49", 00:18:33.020 "strip_size_kb": 0, 00:18:33.020 "state": "online", 00:18:33.020 "raid_level": "raid1", 00:18:33.020 "superblock": false, 00:18:33.020 "num_base_bdevs": 2, 00:18:33.020 "num_base_bdevs_discovered": 2, 00:18:33.020 "num_base_bdevs_operational": 2, 00:18:33.020 "base_bdevs_list": [ 00:18:33.020 { 00:18:33.020 "name": "spare", 00:18:33.020 "uuid": "1dc3fe1d-6f00-59d1-a889-57ddf1f177c0", 00:18:33.020 "is_configured": true, 00:18:33.020 "data_offset": 0, 00:18:33.020 "data_size": 65536 00:18:33.020 }, 00:18:33.020 { 00:18:33.020 "name": "BaseBdev2", 00:18:33.020 "uuid": "fa45e431-04c4-5119-bb3e-748913d2cc8d", 00:18:33.020 "is_configured": true, 00:18:33.020 "data_offset": 0, 00:18:33.020 "data_size": 65536 00:18:33.020 } 00:18:33.020 ] 00:18:33.020 }' 00:18:33.020 11:59:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.020 11:59:23 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:33.278 11:59:23 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:33.537 [2024-07-12 11:59:23.670884] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:33.537 [2024-07-12 11:59:23.670903] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:33.537 [2024-07-12 11:59:23.670945] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:33.537 [2024-07-12 11:59:23.670983] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:33.537 [2024-07-12 11:59:23.670989] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf976c0 name raid_bdev1, state offline 00:18:33.537 11:59:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:18:33.537 11:59:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:33.796 11:59:23 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:18:33.796 /dev/nbd0 00:18:33.796 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:33.796 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:33.796 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:33.797 1+0 records in 00:18:33.797 1+0 records out 00:18:33.797 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000187007 s, 21.9 MB/s 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:33.797 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:18:34.056 /dev/nbd1 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:34.056 1+0 records in 00:18:34.056 1+0 records out 00:18:34.056 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231852 s, 17.7 MB/s 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:34.056 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:34.315 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:34.574 11:59:24 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 692793 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 692793 ']' 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 692793 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:34.574 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 692793 00:18:34.575 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:34.575 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:34.575 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 692793' 00:18:34.575 killing process with pid 692793 00:18:34.575 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 692793 00:18:34.575 Received shutdown signal, test time was about 60.000000 seconds 00:18:34.575 00:18:34.575 Latency(us) 00:18:34.575 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:34.575 =================================================================================================================== 00:18:34.575 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:18:34.575 [2024-07-12 11:59:24.737236] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:34.575 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 692793 00:18:34.575 [2024-07-12 11:59:24.759802] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:34.833 11:59:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # 
return 0 00:18:34.833 00:18:34.833 real 0m16.590s 00:18:34.833 user 0m22.828s 00:18:34.833 sys 0m2.813s 00:18:34.833 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:34.833 11:59:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:34.833 ************************************ 00:18:34.833 END TEST raid_rebuild_test 00:18:34.833 ************************************ 00:18:34.833 11:59:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:34.833 11:59:24 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:18:34.833 11:59:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:18:34.833 11:59:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:34.833 11:59:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:34.833 ************************************ 00:18:34.833 START TEST raid_rebuild_test_sb 00:18:34.833 ************************************ 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # 
echo BaseBdev1 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=695766 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 695766 /var/tmp/spdk-raid.sock 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 695766 ']' 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:34.833 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:34.833 11:59:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:34.833 [2024-07-12 11:59:25.057762] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:18:34.833 [2024-07-12 11:59:25.057799] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid695766 ] 00:18:34.833 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:34.833 Zero copy mechanism will not be used. 
00:18:35.091 [2024-07-12 11:59:25.120234] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:35.091 [2024-07-12 11:59:25.199406] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:35.091 [2024-07-12 11:59:25.253272] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:35.091 [2024-07-12 11:59:25.253299] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:35.660 11:59:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:35.660 11:59:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:35.660 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:35.660 11:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:35.919 BaseBdev1_malloc 00:18:35.919 11:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:35.919 [2024-07-12 11:59:26.148677] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:35.919 [2024-07-12 11:59:26.148717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:35.919 [2024-07-12 11:59:26.148729] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad7010 00:18:35.919 [2024-07-12 11:59:26.148735] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:35.919 [2024-07-12 11:59:26.149859] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:35.919 [2024-07-12 11:59:26.149878] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:35.919 BaseBdev1 
00:18:35.919 11:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:35.919 11:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:36.178 BaseBdev2_malloc 00:18:36.178 11:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:36.437 [2024-07-12 11:59:26.477214] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:36.437 [2024-07-12 11:59:26.477242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.437 [2024-07-12 11:59:26.477255] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad7b60 00:18:36.437 [2024-07-12 11:59:26.477260] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.437 [2024-07-12 11:59:26.478295] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.437 [2024-07-12 11:59:26.478313] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:36.437 BaseBdev2 00:18:36.437 11:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:36.437 spare_malloc 00:18:36.437 11:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:36.695 spare_delay 00:18:36.695 11:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:36.955 [2024-07-12 11:59:26.977889] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:36.955 [2024-07-12 11:59:26.977919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.955 [2024-07-12 11:59:26.977930] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c82880 00:18:36.955 [2024-07-12 11:59:26.977940] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.955 [2024-07-12 11:59:26.979015] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.955 [2024-07-12 11:59:26.979034] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:36.955 spare 00:18:36.955 11:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:36.955 [2024-07-12 11:59:27.134313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:36.955 [2024-07-12 11:59:27.135169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:36.955 [2024-07-12 11:59:27.135276] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c866c0 00:18:36.955 [2024-07-12 11:59:27.135284] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:36.955 [2024-07-12 11:59:27.135411] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c82b10 00:18:36.955 [2024-07-12 11:59:27.135503] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c866c0 00:18:36.955 [2024-07-12 11:59:27.135508] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1c866c0 00:18:36.955 [2024-07-12 11:59:27.135580] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.955 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.214 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.214 "name": "raid_bdev1", 00:18:37.214 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:37.214 "strip_size_kb": 0, 00:18:37.214 "state": "online", 00:18:37.214 "raid_level": "raid1", 00:18:37.214 "superblock": true, 00:18:37.214 "num_base_bdevs": 2, 00:18:37.214 "num_base_bdevs_discovered": 2, 00:18:37.214 
"num_base_bdevs_operational": 2, 00:18:37.214 "base_bdevs_list": [ 00:18:37.214 { 00:18:37.214 "name": "BaseBdev1", 00:18:37.214 "uuid": "6931932e-99f7-5ae3-8022-71413e7d2993", 00:18:37.214 "is_configured": true, 00:18:37.214 "data_offset": 2048, 00:18:37.214 "data_size": 63488 00:18:37.214 }, 00:18:37.214 { 00:18:37.214 "name": "BaseBdev2", 00:18:37.214 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:37.214 "is_configured": true, 00:18:37.214 "data_offset": 2048, 00:18:37.214 "data_size": 63488 00:18:37.214 } 00:18:37.214 ] 00:18:37.214 }' 00:18:37.214 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.214 11:59:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:37.782 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:37.782 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:37.782 [2024-07-12 11:59:27.940524] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:37.782 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:18:37.782 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.782 11:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:38.041 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:18:38.041 [2024-07-12 11:59:28.285301] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c88af0 00:18:38.301 /dev/nbd0 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:38.301 1+0 records in 00:18:38.301 1+0 records out 00:18:38.301 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000204918 s, 20.0 MB/s 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:18:38.301 11:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:18:41.595 63488+0 records in 00:18:41.595 63488+0 records out 00:18:41.595 32505856 bytes (33 MB, 
31 MiB) copied, 3.3024 s, 9.8 MB/s 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:41.595 [2024-07-12 11:59:31.828984] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:41.595 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:41.854 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:41.854 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:41.854 11:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:18:41.854 [2024-07-12 11:59:31.997449] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.854 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:42.113 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:42.113 "name": "raid_bdev1", 00:18:42.113 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:42.113 "strip_size_kb": 0, 00:18:42.113 "state": "online", 00:18:42.113 "raid_level": "raid1", 00:18:42.113 "superblock": true, 00:18:42.113 "num_base_bdevs": 2, 00:18:42.113 "num_base_bdevs_discovered": 1, 00:18:42.113 
"num_base_bdevs_operational": 1, 00:18:42.113 "base_bdevs_list": [ 00:18:42.113 { 00:18:42.113 "name": null, 00:18:42.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.113 "is_configured": false, 00:18:42.113 "data_offset": 2048, 00:18:42.113 "data_size": 63488 00:18:42.113 }, 00:18:42.113 { 00:18:42.113 "name": "BaseBdev2", 00:18:42.113 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:42.113 "is_configured": true, 00:18:42.113 "data_offset": 2048, 00:18:42.113 "data_size": 63488 00:18:42.113 } 00:18:42.113 ] 00:18:42.113 }' 00:18:42.113 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:42.113 11:59:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:42.682 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:42.682 [2024-07-12 11:59:32.779471] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:42.682 [2024-07-12 11:59:32.783817] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c88af0 00:18:42.682 [2024-07-12 11:59:32.785215] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:42.682 11:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:18:43.621 11:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:43.621 11:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:43.621 11:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:43.621 11:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:43.621 11:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:18:43.621 11:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.621 11:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:43.879 11:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:43.879 "name": "raid_bdev1", 00:18:43.879 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:43.879 "strip_size_kb": 0, 00:18:43.879 "state": "online", 00:18:43.879 "raid_level": "raid1", 00:18:43.879 "superblock": true, 00:18:43.879 "num_base_bdevs": 2, 00:18:43.879 "num_base_bdevs_discovered": 2, 00:18:43.879 "num_base_bdevs_operational": 2, 00:18:43.879 "process": { 00:18:43.879 "type": "rebuild", 00:18:43.879 "target": "spare", 00:18:43.879 "progress": { 00:18:43.879 "blocks": 22528, 00:18:43.879 "percent": 35 00:18:43.879 } 00:18:43.879 }, 00:18:43.879 "base_bdevs_list": [ 00:18:43.879 { 00:18:43.879 "name": "spare", 00:18:43.879 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:43.879 "is_configured": true, 00:18:43.879 "data_offset": 2048, 00:18:43.879 "data_size": 63488 00:18:43.879 }, 00:18:43.879 { 00:18:43.879 "name": "BaseBdev2", 00:18:43.879 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:43.879 "is_configured": true, 00:18:43.879 "data_offset": 2048, 00:18:43.879 "data_size": 63488 00:18:43.879 } 00:18:43.879 ] 00:18:43.879 }' 00:18:43.879 11:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:43.879 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:43.879 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:43.879 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:43.879 11:59:34 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:44.138 [2024-07-12 11:59:34.203836] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:44.138 [2024-07-12 11:59:34.295761] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:44.138 [2024-07-12 11:59:34.295800] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:44.138 [2024-07-12 11:59:34.295809] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:44.138 [2024-07-12 11:59:34.295813] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.138 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:44.396 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.396 "name": "raid_bdev1", 00:18:44.396 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:44.396 "strip_size_kb": 0, 00:18:44.396 "state": "online", 00:18:44.396 "raid_level": "raid1", 00:18:44.396 "superblock": true, 00:18:44.396 "num_base_bdevs": 2, 00:18:44.396 "num_base_bdevs_discovered": 1, 00:18:44.396 "num_base_bdevs_operational": 1, 00:18:44.396 "base_bdevs_list": [ 00:18:44.396 { 00:18:44.396 "name": null, 00:18:44.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.396 "is_configured": false, 00:18:44.396 "data_offset": 2048, 00:18:44.396 "data_size": 63488 00:18:44.396 }, 00:18:44.396 { 00:18:44.396 "name": "BaseBdev2", 00:18:44.396 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:44.396 "is_configured": true, 00:18:44.396 "data_offset": 2048, 00:18:44.396 "data_size": 63488 00:18:44.396 } 00:18:44.396 ] 00:18:44.396 }' 00:18:44.396 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.396 11:59:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:44.965 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:44.965 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:44.965 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:44.965 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:44.965 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:44.965 11:59:34 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.965 11:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:44.965 11:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:44.965 "name": "raid_bdev1", 00:18:44.965 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:44.965 "strip_size_kb": 0, 00:18:44.965 "state": "online", 00:18:44.965 "raid_level": "raid1", 00:18:44.965 "superblock": true, 00:18:44.965 "num_base_bdevs": 2, 00:18:44.965 "num_base_bdevs_discovered": 1, 00:18:44.965 "num_base_bdevs_operational": 1, 00:18:44.965 "base_bdevs_list": [ 00:18:44.965 { 00:18:44.965 "name": null, 00:18:44.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.965 "is_configured": false, 00:18:44.965 "data_offset": 2048, 00:18:44.965 "data_size": 63488 00:18:44.965 }, 00:18:44.965 { 00:18:44.965 "name": "BaseBdev2", 00:18:44.965 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:44.965 "is_configured": true, 00:18:44.965 "data_offset": 2048, 00:18:44.965 "data_size": 63488 00:18:44.965 } 00:18:44.965 ] 00:18:44.965 }' 00:18:44.965 11:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:44.965 11:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:44.965 11:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:45.225 11:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:45.225 11:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:45.225 [2024-07-12 11:59:35.374748] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:45.225 [2024-07-12 11:59:35.379055] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c82b10 00:18:45.225 [2024-07-12 11:59:35.380141] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:45.225 11:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:18:46.162 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:46.162 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:46.162 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:46.162 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:46.162 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:46.162 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.162 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:46.421 "name": "raid_bdev1", 00:18:46.421 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:46.421 "strip_size_kb": 0, 00:18:46.421 "state": "online", 00:18:46.421 "raid_level": "raid1", 00:18:46.421 "superblock": true, 00:18:46.421 "num_base_bdevs": 2, 00:18:46.421 "num_base_bdevs_discovered": 2, 00:18:46.421 "num_base_bdevs_operational": 2, 00:18:46.421 "process": { 00:18:46.421 "type": "rebuild", 00:18:46.421 "target": "spare", 00:18:46.421 "progress": { 00:18:46.421 "blocks": 22528, 00:18:46.421 "percent": 35 00:18:46.421 } 00:18:46.421 }, 00:18:46.421 
"base_bdevs_list": [ 00:18:46.421 { 00:18:46.421 "name": "spare", 00:18:46.421 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:46.421 "is_configured": true, 00:18:46.421 "data_offset": 2048, 00:18:46.421 "data_size": 63488 00:18:46.421 }, 00:18:46.421 { 00:18:46.421 "name": "BaseBdev2", 00:18:46.421 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:46.421 "is_configured": true, 00:18:46.421 "data_offset": 2048, 00:18:46.421 "data_size": 63488 00:18:46.421 } 00:18:46.421 ] 00:18:46.421 }' 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:18:46.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=596 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:46.421 11:59:36 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.421 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:46.681 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:46.681 "name": "raid_bdev1", 00:18:46.681 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:46.681 "strip_size_kb": 0, 00:18:46.681 "state": "online", 00:18:46.681 "raid_level": "raid1", 00:18:46.681 "superblock": true, 00:18:46.681 "num_base_bdevs": 2, 00:18:46.681 "num_base_bdevs_discovered": 2, 00:18:46.681 "num_base_bdevs_operational": 2, 00:18:46.681 "process": { 00:18:46.681 "type": "rebuild", 00:18:46.681 "target": "spare", 00:18:46.681 "progress": { 00:18:46.681 "blocks": 26624, 00:18:46.681 "percent": 41 00:18:46.681 } 00:18:46.681 }, 00:18:46.681 "base_bdevs_list": [ 00:18:46.681 { 00:18:46.681 "name": "spare", 00:18:46.681 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:46.681 "is_configured": true, 00:18:46.681 "data_offset": 2048, 00:18:46.681 "data_size": 63488 00:18:46.681 }, 00:18:46.681 { 00:18:46.681 "name": "BaseBdev2", 00:18:46.681 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:46.681 "is_configured": true, 00:18:46.681 "data_offset": 2048, 00:18:46.681 "data_size": 63488 00:18:46.681 } 00:18:46.681 ] 00:18:46.681 }' 00:18:46.681 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:18:46.681 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:46.681 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:46.681 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:46.681 11:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:47.616 11:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:47.616 11:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:47.616 11:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:47.616 11:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:47.616 11:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:47.616 11:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:47.876 11:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.876 11:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:47.876 11:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:47.876 "name": "raid_bdev1", 00:18:47.876 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:47.876 "strip_size_kb": 0, 00:18:47.876 "state": "online", 00:18:47.876 "raid_level": "raid1", 00:18:47.876 "superblock": true, 00:18:47.876 "num_base_bdevs": 2, 00:18:47.876 "num_base_bdevs_discovered": 2, 00:18:47.876 "num_base_bdevs_operational": 2, 00:18:47.876 "process": { 00:18:47.876 "type": "rebuild", 00:18:47.876 "target": "spare", 
00:18:47.876 "progress": { 00:18:47.876 "blocks": 53248, 00:18:47.876 "percent": 83 00:18:47.876 } 00:18:47.876 }, 00:18:47.876 "base_bdevs_list": [ 00:18:47.876 { 00:18:47.876 "name": "spare", 00:18:47.876 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:47.876 "is_configured": true, 00:18:47.876 "data_offset": 2048, 00:18:47.876 "data_size": 63488 00:18:47.876 }, 00:18:47.876 { 00:18:47.876 "name": "BaseBdev2", 00:18:47.876 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:47.876 "is_configured": true, 00:18:47.876 "data_offset": 2048, 00:18:47.876 "data_size": 63488 00:18:47.876 } 00:18:47.876 ] 00:18:47.876 }' 00:18:47.876 11:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:47.876 11:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:47.876 11:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:47.876 11:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:47.876 11:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:48.441 [2024-07-12 11:59:38.501714] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:48.441 [2024-07-12 11:59:38.501754] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:48.441 [2024-07-12 11:59:38.501809] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:49.006 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:49.006 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:49.006 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:49.006 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:18:49.006 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:49.006 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:49.006 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.006 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:49.264 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:49.264 "name": "raid_bdev1", 00:18:49.264 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:49.264 "strip_size_kb": 0, 00:18:49.264 "state": "online", 00:18:49.264 "raid_level": "raid1", 00:18:49.264 "superblock": true, 00:18:49.264 "num_base_bdevs": 2, 00:18:49.264 "num_base_bdevs_discovered": 2, 00:18:49.264 "num_base_bdevs_operational": 2, 00:18:49.264 "base_bdevs_list": [ 00:18:49.264 { 00:18:49.264 "name": "spare", 00:18:49.264 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:49.264 "is_configured": true, 00:18:49.264 "data_offset": 2048, 00:18:49.264 "data_size": 63488 00:18:49.264 }, 00:18:49.264 { 00:18:49.264 "name": "BaseBdev2", 00:18:49.264 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:49.264 "is_configured": true, 00:18:49.264 "data_offset": 2048, 00:18:49.264 "data_size": 63488 00:18:49.264 } 00:18:49.264 ] 00:18:49.264 }' 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:18:49.265 
11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:49.265 "name": "raid_bdev1", 00:18:49.265 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:49.265 "strip_size_kb": 0, 00:18:49.265 "state": "online", 00:18:49.265 "raid_level": "raid1", 00:18:49.265 "superblock": true, 00:18:49.265 "num_base_bdevs": 2, 00:18:49.265 "num_base_bdevs_discovered": 2, 00:18:49.265 "num_base_bdevs_operational": 2, 00:18:49.265 "base_bdevs_list": [ 00:18:49.265 { 00:18:49.265 "name": "spare", 00:18:49.265 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:49.265 "is_configured": true, 00:18:49.265 "data_offset": 2048, 00:18:49.265 "data_size": 63488 00:18:49.265 }, 00:18:49.265 { 00:18:49.265 "name": "BaseBdev2", 00:18:49.265 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:49.265 "is_configured": true, 00:18:49.265 "data_offset": 2048, 00:18:49.265 "data_size": 63488 00:18:49.265 } 00:18:49.265 ] 00:18:49.265 }' 00:18:49.265 11:59:39 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:49.524 "name": "raid_bdev1", 00:18:49.524 "uuid": 
"2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:49.524 "strip_size_kb": 0, 00:18:49.524 "state": "online", 00:18:49.524 "raid_level": "raid1", 00:18:49.524 "superblock": true, 00:18:49.524 "num_base_bdevs": 2, 00:18:49.524 "num_base_bdevs_discovered": 2, 00:18:49.524 "num_base_bdevs_operational": 2, 00:18:49.524 "base_bdevs_list": [ 00:18:49.524 { 00:18:49.524 "name": "spare", 00:18:49.524 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:49.524 "is_configured": true, 00:18:49.524 "data_offset": 2048, 00:18:49.524 "data_size": 63488 00:18:49.524 }, 00:18:49.524 { 00:18:49.524 "name": "BaseBdev2", 00:18:49.524 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:49.524 "is_configured": true, 00:18:49.524 "data_offset": 2048, 00:18:49.524 "data_size": 63488 00:18:49.524 } 00:18:49.524 ] 00:18:49.524 }' 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:49.524 11:59:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:50.090 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:50.347 [2024-07-12 11:59:40.338384] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:50.347 [2024-07-12 11:59:40.338402] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:50.347 [2024-07-12 11:59:40.338444] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:50.347 [2024-07-12 11:59:40.338482] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:50.347 [2024-07-12 11:59:40.338488] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c866c0 name raid_bdev1, state offline 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:50.347 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:18:50.605 /dev/nbd0 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:50.605 1+0 records in 00:18:50.605 1+0 records out 00:18:50.605 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189206 s, 21.6 MB/s 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:50.605 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:50.605 11:59:40 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:18:50.863 /dev/nbd1 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:50.863 1+0 records in 00:18:50.863 1+0 records out 00:18:50.863 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227042 s, 18.0 MB/s 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:50.863 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:18:50.864 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:50.864 11:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:51.122 11:59:41 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:18:51.122 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:51.380 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:51.639 [2024-07-12 11:59:41.680456] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:18:51.639 [2024-07-12 11:59:41.680486] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:51.639 [2024-07-12 11:59:41.680498] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c888f0 00:18:51.639 [2024-07-12 11:59:41.680524] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:51.639 [2024-07-12 11:59:41.681653] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:51.639 [2024-07-12 11:59:41.681673] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:51.639 [2024-07-12 11:59:41.681735] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:18:51.639 [2024-07-12 11:59:41.681752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:51.639 [2024-07-12 11:59:41.681804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:51.639 spare 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.639 [2024-07-12 11:59:41.782091] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c81890 00:18:51.639 [2024-07-12 11:59:41.782101] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:51.639 [2024-07-12 11:59:41.782228] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c85730 00:18:51.639 [2024-07-12 11:59:41.782330] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c81890 00:18:51.639 [2024-07-12 11:59:41.782335] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c81890 00:18:51.639 [2024-07-12 11:59:41.782398] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.639 "name": "raid_bdev1", 00:18:51.639 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:51.639 "strip_size_kb": 0, 00:18:51.639 "state": "online", 00:18:51.639 "raid_level": "raid1", 00:18:51.639 "superblock": true, 00:18:51.639 "num_base_bdevs": 2, 00:18:51.639 "num_base_bdevs_discovered": 2, 00:18:51.639 "num_base_bdevs_operational": 2, 00:18:51.639 "base_bdevs_list": [ 00:18:51.639 { 00:18:51.639 "name": "spare", 00:18:51.639 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:51.639 "is_configured": true, 00:18:51.639 "data_offset": 2048, 00:18:51.639 "data_size": 63488 00:18:51.639 }, 00:18:51.639 { 00:18:51.639 "name": "BaseBdev2", 00:18:51.639 "uuid": 
"da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:51.639 "is_configured": true, 00:18:51.639 "data_offset": 2048, 00:18:51.639 "data_size": 63488 00:18:51.639 } 00:18:51.639 ] 00:18:51.639 }' 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.639 11:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:52.206 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:52.206 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:52.206 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:52.206 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:52.206 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:52.206 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.206 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:52.462 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:52.462 "name": "raid_bdev1", 00:18:52.462 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:52.462 "strip_size_kb": 0, 00:18:52.462 "state": "online", 00:18:52.462 "raid_level": "raid1", 00:18:52.462 "superblock": true, 00:18:52.462 "num_base_bdevs": 2, 00:18:52.462 "num_base_bdevs_discovered": 2, 00:18:52.462 "num_base_bdevs_operational": 2, 00:18:52.462 "base_bdevs_list": [ 00:18:52.462 { 00:18:52.462 "name": "spare", 00:18:52.462 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:52.462 "is_configured": true, 00:18:52.462 "data_offset": 2048, 00:18:52.462 "data_size": 63488 00:18:52.462 }, 00:18:52.462 { 
00:18:52.462 "name": "BaseBdev2", 00:18:52.462 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:52.462 "is_configured": true, 00:18:52.462 "data_offset": 2048, 00:18:52.462 "data_size": 63488 00:18:52.462 } 00:18:52.462 ] 00:18:52.462 }' 00:18:52.462 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:52.462 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:52.462 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:52.462 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:52.462 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.462 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:52.719 [2024-07-12 11:59:42.947792] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.719 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.977 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.977 11:59:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:52.977 11:59:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.977 "name": "raid_bdev1", 00:18:52.977 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:52.977 "strip_size_kb": 0, 00:18:52.977 "state": "online", 00:18:52.977 "raid_level": "raid1", 00:18:52.977 "superblock": true, 00:18:52.977 "num_base_bdevs": 2, 00:18:52.977 "num_base_bdevs_discovered": 1, 00:18:52.977 "num_base_bdevs_operational": 1, 00:18:52.977 "base_bdevs_list": [ 00:18:52.977 { 00:18:52.977 "name": null, 00:18:52.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.977 "is_configured": false, 00:18:52.977 "data_offset": 2048, 00:18:52.977 "data_size": 63488 00:18:52.977 }, 00:18:52.977 { 00:18:52.977 "name": "BaseBdev2", 00:18:52.977 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:52.977 "is_configured": true, 00:18:52.977 "data_offset": 2048, 00:18:52.977 "data_size": 63488 00:18:52.977 } 00:18:52.977 ] 00:18:52.977 }' 00:18:52.977 11:59:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.977 11:59:43 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:53.545 11:59:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:53.545 [2024-07-12 11:59:43.769927] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:53.545 [2024-07-12 11:59:43.770041] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:18:53.545 [2024-07-12 11:59:43.770050] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:18:53.545 [2024-07-12 11:59:43.770068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:53.545 [2024-07-12 11:59:43.774292] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf2ab0 00:18:53.545 [2024-07-12 11:59:43.775802] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:53.545 11:59:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:54.921 "name": "raid_bdev1", 00:18:54.921 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:54.921 "strip_size_kb": 0, 00:18:54.921 "state": "online", 00:18:54.921 "raid_level": "raid1", 00:18:54.921 "superblock": true, 00:18:54.921 "num_base_bdevs": 2, 00:18:54.921 "num_base_bdevs_discovered": 2, 00:18:54.921 "num_base_bdevs_operational": 2, 00:18:54.921 "process": { 00:18:54.921 "type": "rebuild", 00:18:54.921 "target": "spare", 00:18:54.921 "progress": { 00:18:54.921 "blocks": 22528, 00:18:54.921 "percent": 35 00:18:54.921 } 00:18:54.921 }, 00:18:54.921 "base_bdevs_list": [ 00:18:54.921 { 00:18:54.921 "name": "spare", 00:18:54.921 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:54.921 "is_configured": true, 00:18:54.921 "data_offset": 2048, 00:18:54.921 "data_size": 63488 00:18:54.921 }, 00:18:54.921 { 00:18:54.921 "name": "BaseBdev2", 00:18:54.921 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:54.921 "is_configured": true, 00:18:54.921 "data_offset": 2048, 00:18:54.921 "data_size": 63488 00:18:54.921 } 00:18:54.921 ] 00:18:54.921 }' 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:54.921 11:59:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:54.921 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:54.921 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:54.921 [2024-07-12 11:59:45.146254] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:18:55.179 [2024-07-12 11:59:45.185587] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:55.179 [2024-07-12 11:59:45.185612] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:55.179 [2024-07-12 11:59:45.185621] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:55.179 [2024-07-12 11:59:45.185640] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:55.179 11:59:45 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:55.179 "name": "raid_bdev1", 00:18:55.179 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:55.179 "strip_size_kb": 0, 00:18:55.179 "state": "online", 00:18:55.179 "raid_level": "raid1", 00:18:55.179 "superblock": true, 00:18:55.179 "num_base_bdevs": 2, 00:18:55.179 "num_base_bdevs_discovered": 1, 00:18:55.179 "num_base_bdevs_operational": 1, 00:18:55.179 "base_bdevs_list": [ 00:18:55.179 { 00:18:55.179 "name": null, 00:18:55.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:55.179 "is_configured": false, 00:18:55.179 "data_offset": 2048, 00:18:55.179 "data_size": 63488 00:18:55.179 }, 00:18:55.179 { 00:18:55.179 "name": "BaseBdev2", 00:18:55.179 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:55.179 "is_configured": true, 00:18:55.179 "data_offset": 2048, 00:18:55.179 "data_size": 63488 00:18:55.179 } 00:18:55.179 ] 00:18:55.179 }' 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:55.179 11:59:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:55.745 11:59:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:56.003 [2024-07-12 11:59:46.019686] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:56.003 [2024-07-12 11:59:46.019721] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:56.003 [2024-07-12 11:59:46.019751] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c82ab0 00:18:56.003 [2024-07-12 11:59:46.019758] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:56.003 [2024-07-12 11:59:46.020024] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:56.003 [2024-07-12 
11:59:46.020034] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:56.003 [2024-07-12 11:59:46.020088] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:18:56.003 [2024-07-12 11:59:46.020095] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:18:56.003 [2024-07-12 11:59:46.020100] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:18:56.003 [2024-07-12 11:59:46.020111] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:56.003 [2024-07-12 11:59:46.024294] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ace8d0 00:18:56.003 [2024-07-12 11:59:46.025339] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:56.003 spare 00:18:56.003 11:59:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:18:56.939 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:56.939 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:56.939 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:56.939 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:56.939 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:56.939 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.939 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:57.198 11:59:47 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:57.198 "name": "raid_bdev1", 00:18:57.198 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:57.198 "strip_size_kb": 0, 00:18:57.198 "state": "online", 00:18:57.198 "raid_level": "raid1", 00:18:57.198 "superblock": true, 00:18:57.198 "num_base_bdevs": 2, 00:18:57.198 "num_base_bdevs_discovered": 2, 00:18:57.198 "num_base_bdevs_operational": 2, 00:18:57.198 "process": { 00:18:57.198 "type": "rebuild", 00:18:57.198 "target": "spare", 00:18:57.198 "progress": { 00:18:57.198 "blocks": 22528, 00:18:57.198 "percent": 35 00:18:57.198 } 00:18:57.198 }, 00:18:57.198 "base_bdevs_list": [ 00:18:57.198 { 00:18:57.198 "name": "spare", 00:18:57.198 "uuid": "22dd0865-25b8-5ebd-b5bb-06a429bbbea7", 00:18:57.198 "is_configured": true, 00:18:57.198 "data_offset": 2048, 00:18:57.198 "data_size": 63488 00:18:57.198 }, 00:18:57.198 { 00:18:57.198 "name": "BaseBdev2", 00:18:57.198 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:57.198 "is_configured": true, 00:18:57.198 "data_offset": 2048, 00:18:57.198 "data_size": 63488 00:18:57.198 } 00:18:57.198 ] 00:18:57.198 }' 00:18:57.198 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:57.198 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:57.198 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:57.198 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:57.198 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:57.198 [2024-07-12 11:59:47.439912] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:57.456 [2024-07-12 11:59:47.535799] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild 
on raid bdev raid_bdev1: No such device 00:18:57.456 [2024-07-12 11:59:47.535825] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:57.456 [2024-07-12 11:59:47.535833] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:57.456 [2024-07-12 11:59:47.535837] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:57.456 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:57.456 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:57.456 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:57.456 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:57.456 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:57.456 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:57.456 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:57.456 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:57.456 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:57.457 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:57.457 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.457 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:57.716 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:57.716 "name": "raid_bdev1", 00:18:57.716 "uuid": 
"2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:57.716 "strip_size_kb": 0, 00:18:57.716 "state": "online", 00:18:57.716 "raid_level": "raid1", 00:18:57.716 "superblock": true, 00:18:57.716 "num_base_bdevs": 2, 00:18:57.716 "num_base_bdevs_discovered": 1, 00:18:57.716 "num_base_bdevs_operational": 1, 00:18:57.716 "base_bdevs_list": [ 00:18:57.716 { 00:18:57.716 "name": null, 00:18:57.716 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.716 "is_configured": false, 00:18:57.716 "data_offset": 2048, 00:18:57.716 "data_size": 63488 00:18:57.716 }, 00:18:57.716 { 00:18:57.716 "name": "BaseBdev2", 00:18:57.716 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:57.716 "is_configured": true, 00:18:57.716 "data_offset": 2048, 00:18:57.716 "data_size": 63488 00:18:57.716 } 00:18:57.716 ] 00:18:57.716 }' 00:18:57.716 11:59:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:57.716 11:59:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:57.977 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:57.977 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:57.977 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:57.977 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:57.977 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:57.977 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.977 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:58.328 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:18:58.328 "name": "raid_bdev1", 00:18:58.328 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:58.328 "strip_size_kb": 0, 00:18:58.328 "state": "online", 00:18:58.328 "raid_level": "raid1", 00:18:58.328 "superblock": true, 00:18:58.328 "num_base_bdevs": 2, 00:18:58.328 "num_base_bdevs_discovered": 1, 00:18:58.328 "num_base_bdevs_operational": 1, 00:18:58.328 "base_bdevs_list": [ 00:18:58.328 { 00:18:58.328 "name": null, 00:18:58.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.328 "is_configured": false, 00:18:58.328 "data_offset": 2048, 00:18:58.328 "data_size": 63488 00:18:58.328 }, 00:18:58.328 { 00:18:58.328 "name": "BaseBdev2", 00:18:58.328 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:58.328 "is_configured": true, 00:18:58.328 "data_offset": 2048, 00:18:58.328 "data_size": 63488 00:18:58.328 } 00:18:58.328 ] 00:18:58.328 }' 00:18:58.328 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:58.328 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:58.328 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:58.328 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:58.328 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:18:58.587 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:58.587 [2024-07-12 11:59:48.746986] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:58.587 [2024-07-12 11:59:48.747018] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.587 
[2024-07-12 11:59:48.747031] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c84fa0 00:18:58.587 [2024-07-12 11:59:48.747037] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.587 [2024-07-12 11:59:48.747278] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.587 [2024-07-12 11:59:48.747288] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:58.587 [2024-07-12 11:59:48.747330] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:18:58.587 [2024-07-12 11:59:48.747338] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:18:58.587 [2024-07-12 11:59:48.747344] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:18:58.587 BaseBdev1 00:18:58.587 11:59:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.524 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:59.783 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.783 "name": "raid_bdev1", 00:18:59.783 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:18:59.783 "strip_size_kb": 0, 00:18:59.783 "state": "online", 00:18:59.783 "raid_level": "raid1", 00:18:59.783 "superblock": true, 00:18:59.783 "num_base_bdevs": 2, 00:18:59.783 "num_base_bdevs_discovered": 1, 00:18:59.783 "num_base_bdevs_operational": 1, 00:18:59.783 "base_bdevs_list": [ 00:18:59.783 { 00:18:59.783 "name": null, 00:18:59.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.783 "is_configured": false, 00:18:59.783 "data_offset": 2048, 00:18:59.783 "data_size": 63488 00:18:59.783 }, 00:18:59.783 { 00:18:59.783 "name": "BaseBdev2", 00:18:59.783 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:18:59.783 "is_configured": true, 00:18:59.783 "data_offset": 2048, 00:18:59.783 "data_size": 63488 00:18:59.783 } 00:18:59.783 ] 00:18:59.783 }' 00:18:59.783 11:59:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.783 11:59:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:00.350 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:00.350 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:00.350 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:19:00.350 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:00.350 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:00.350 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.350 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:00.609 "name": "raid_bdev1", 00:19:00.609 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:19:00.609 "strip_size_kb": 0, 00:19:00.609 "state": "online", 00:19:00.609 "raid_level": "raid1", 00:19:00.609 "superblock": true, 00:19:00.609 "num_base_bdevs": 2, 00:19:00.609 "num_base_bdevs_discovered": 1, 00:19:00.609 "num_base_bdevs_operational": 1, 00:19:00.609 "base_bdevs_list": [ 00:19:00.609 { 00:19:00.609 "name": null, 00:19:00.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:00.609 "is_configured": false, 00:19:00.609 "data_offset": 2048, 00:19:00.609 "data_size": 63488 00:19:00.609 }, 00:19:00.609 { 00:19:00.609 "name": "BaseBdev2", 00:19:00.609 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:19:00.609 "is_configured": true, 00:19:00.609 "data_offset": 2048, 00:19:00.609 "data_size": 63488 00:19:00.609 } 00:19:00.609 ] 00:19:00.609 }' 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:00.609 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:00.868 [2024-07-12 11:59:50.888540] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:19:00.868 [2024-07-12 11:59:50.888637] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:00.868 [2024-07-12 11:59:50.888646] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:00.868 request: 00:19:00.868 { 00:19:00.868 "raid_bdev": "raid_bdev1", 00:19:00.868 "base_bdev": "BaseBdev1", 00:19:00.868 "method": "bdev_raid_add_base_bdev", 00:19:00.868 "req_id": 1 00:19:00.868 } 00:19:00.868 Got JSON-RPC error response 00:19:00.868 response: 00:19:00.868 { 00:19:00.868 "code": -22, 00:19:00.868 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:19:00.868 } 00:19:00.868 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:19:00.868 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:00.868 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:00.868 11:59:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:00.868 11:59:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.805 11:59:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.064 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:02.064 "name": "raid_bdev1", 00:19:02.064 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:19:02.064 "strip_size_kb": 0, 00:19:02.064 "state": "online", 00:19:02.064 "raid_level": "raid1", 00:19:02.064 "superblock": true, 00:19:02.064 "num_base_bdevs": 2, 00:19:02.064 "num_base_bdevs_discovered": 1, 00:19:02.064 "num_base_bdevs_operational": 1, 00:19:02.064 "base_bdevs_list": [ 00:19:02.064 { 00:19:02.064 "name": null, 00:19:02.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.064 "is_configured": false, 00:19:02.064 "data_offset": 2048, 00:19:02.064 "data_size": 63488 00:19:02.064 }, 00:19:02.064 { 00:19:02.064 "name": "BaseBdev2", 00:19:02.064 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:19:02.064 "is_configured": true, 00:19:02.064 "data_offset": 2048, 00:19:02.064 "data_size": 63488 00:19:02.064 } 00:19:02.064 ] 00:19:02.064 }' 00:19:02.064 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.064 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:02.630 
11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:02.630 "name": "raid_bdev1", 00:19:02.630 "uuid": "2ec41899-612d-43f6-aa81-54d23b36f4c4", 00:19:02.630 "strip_size_kb": 0, 00:19:02.630 "state": "online", 00:19:02.630 "raid_level": "raid1", 00:19:02.630 "superblock": true, 00:19:02.630 "num_base_bdevs": 2, 00:19:02.630 "num_base_bdevs_discovered": 1, 00:19:02.630 "num_base_bdevs_operational": 1, 00:19:02.630 "base_bdevs_list": [ 00:19:02.630 { 00:19:02.630 "name": null, 00:19:02.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.630 "is_configured": false, 00:19:02.630 "data_offset": 2048, 00:19:02.630 "data_size": 63488 00:19:02.630 }, 00:19:02.630 { 00:19:02.630 "name": "BaseBdev2", 00:19:02.630 "uuid": "da16bc01-d773-53b7-8697-b9ae47c8413b", 00:19:02.630 "is_configured": true, 00:19:02.630 "data_offset": 2048, 00:19:02.630 "data_size": 63488 00:19:02.630 } 00:19:02.630 ] 00:19:02.630 }' 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 695766 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 695766 ']' 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 695766 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 695766 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 695766' 00:19:02.630 killing process with pid 695766 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 695766 00:19:02.630 Received shutdown signal, test time was about 60.000000 seconds 00:19:02.630 00:19:02.630 Latency(us) 00:19:02.630 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:02.630 =================================================================================================================== 00:19:02.630 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:19:02.630 [2024-07-12 11:59:52.870754] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:02.630 [2024-07-12 11:59:52.870817] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:02.630 [2024-07-12 11:59:52.870848] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:02.630 [2024-07-12 11:59:52.870854] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c81890 name raid_bdev1, state offline 00:19:02.630 11:59:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 695766 00:19:02.889 [2024-07-12 11:59:52.894400] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:02.889 11:59:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:19:02.889 00:19:02.889 real 0m28.066s 00:19:02.889 user 0m40.852s 00:19:02.889 sys 0m3.927s 00:19:02.889 11:59:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:02.889 11:59:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:02.889 ************************************ 00:19:02.889 END TEST raid_rebuild_test_sb 00:19:02.889 ************************************ 00:19:02.889 11:59:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:02.889 11:59:53 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:19:02.889 11:59:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:19:02.889 11:59:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:02.889 11:59:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:03.148 ************************************ 00:19:03.148 START TEST raid_rebuild_test_io 00:19:03.148 ************************************ 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:19:03.148 11:59:53 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=700784 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 700784 /var/tmp/spdk-raid.sock 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 700784 ']' 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:03.148 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:03.148 11:59:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:03.148 [2024-07-12 11:59:53.194221] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:19:03.148 [2024-07-12 11:59:53.194259] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid700784 ] 00:19:03.148 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:03.148 Zero copy mechanism will not be used. 
00:19:03.148 [2024-07-12 11:59:53.258710] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:03.148 [2024-07-12 11:59:53.329697] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:19:03.148 [2024-07-12 11:59:53.381318] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:03.148 [2024-07-12 11:59:53.381345] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:04.083 11:59:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:04.083 11:59:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:19:04.083 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:04.083 11:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:04.083 BaseBdev1_malloc 00:19:04.083 11:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:04.083 [2024-07-12 11:59:54.276913] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:04.083 [2024-07-12 11:59:54.276947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.083 [2024-07-12 11:59:54.276959] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2049010 00:19:04.083 [2024-07-12 11:59:54.276981] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.083 [2024-07-12 11:59:54.278074] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.083 [2024-07-12 11:59:54.278093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:04.083 BaseBdev1 
00:19:04.083 11:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:04.083 11:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:04.341 BaseBdev2_malloc 00:19:04.341 11:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:04.599 [2024-07-12 11:59:54.605103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:04.599 [2024-07-12 11:59:54.605131] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.599 [2024-07-12 11:59:54.605142] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2049b60 00:19:04.599 [2024-07-12 11:59:54.605148] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.599 [2024-07-12 11:59:54.606104] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.599 [2024-07-12 11:59:54.606123] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:04.599 BaseBdev2 00:19:04.599 11:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:04.599 spare_malloc 00:19:04.599 11:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:04.857 spare_delay 00:19:04.857 11:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:05.114 [2024-07-12 11:59:55.145715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:05.115 [2024-07-12 11:59:55.145739] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:05.115 [2024-07-12 11:59:55.145749] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21f4880 00:19:05.115 [2024-07-12 11:59:55.145755] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:05.115 [2024-07-12 11:59:55.146693] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:05.115 [2024-07-12 11:59:55.146712] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:05.115 spare 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:05.115 [2024-07-12 11:59:55.314163] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:05.115 [2024-07-12 11:59:55.314949] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:05.115 [2024-07-12 11:59:55.314998] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21f86c0 00:19:05.115 [2024-07-12 11:59:55.315004] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:05.115 [2024-07-12 11:59:55.315128] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21f4b10 00:19:05.115 [2024-07-12 11:59:55.315219] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21f86c0 00:19:05.115 [2024-07-12 11:59:55.315225] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x21f86c0 00:19:05.115 [2024-07-12 11:59:55.315290] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.115 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:05.372 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.372 "name": "raid_bdev1", 00:19:05.372 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:05.372 "strip_size_kb": 0, 00:19:05.372 "state": "online", 00:19:05.372 "raid_level": "raid1", 00:19:05.372 "superblock": false, 00:19:05.372 "num_base_bdevs": 2, 00:19:05.372 "num_base_bdevs_discovered": 2, 00:19:05.372 
"num_base_bdevs_operational": 2, 00:19:05.372 "base_bdevs_list": [ 00:19:05.372 { 00:19:05.372 "name": "BaseBdev1", 00:19:05.372 "uuid": "27dcaecf-5dc1-5f50-b399-95956a54cc9c", 00:19:05.372 "is_configured": true, 00:19:05.372 "data_offset": 0, 00:19:05.372 "data_size": 65536 00:19:05.372 }, 00:19:05.372 { 00:19:05.372 "name": "BaseBdev2", 00:19:05.372 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:05.372 "is_configured": true, 00:19:05.372 "data_offset": 0, 00:19:05.372 "data_size": 65536 00:19:05.372 } 00:19:05.372 ] 00:19:05.372 }' 00:19:05.372 11:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.372 11:59:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:05.937 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:05.937 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:05.937 [2024-07-12 11:59:56.160494] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:06.195 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:19:06.195 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.195 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:06.195 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:19:06.195 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:19:06.195 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock 
perform_tests 00:19:06.195 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:06.195 [2024-07-12 11:59:56.430903] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21f3d40 00:19:06.195 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:06.195 Zero copy mechanism will not be used. 00:19:06.195 Running I/O for 60 seconds... 00:19:06.453 [2024-07-12 11:59:56.512296] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:06.453 [2024-07-12 11:59:56.522479] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x21f3d40 00:19:06.453 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:06.453 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:06.453 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:06.453 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:06.453 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:06.454 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:06.454 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.454 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.454 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.454 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.454 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.454 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:06.712 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.712 "name": "raid_bdev1", 00:19:06.712 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:06.712 "strip_size_kb": 0, 00:19:06.712 "state": "online", 00:19:06.712 "raid_level": "raid1", 00:19:06.712 "superblock": false, 00:19:06.712 "num_base_bdevs": 2, 00:19:06.712 "num_base_bdevs_discovered": 1, 00:19:06.712 "num_base_bdevs_operational": 1, 00:19:06.712 "base_bdevs_list": [ 00:19:06.712 { 00:19:06.712 "name": null, 00:19:06.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.712 "is_configured": false, 00:19:06.712 "data_offset": 0, 00:19:06.712 "data_size": 65536 00:19:06.712 }, 00:19:06.712 { 00:19:06.712 "name": "BaseBdev2", 00:19:06.712 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:06.712 "is_configured": true, 00:19:06.712 "data_offset": 0, 00:19:06.712 "data_size": 65536 00:19:06.712 } 00:19:06.712 ] 00:19:06.712 }' 00:19:06.712 11:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.712 11:59:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:07.279 11:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:07.279 [2024-07-12 11:59:57.375101] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:07.279 11:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:07.279 [2024-07-12 11:59:57.443316] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217be90 00:19:07.279 [2024-07-12 11:59:57.444810] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:07.538 [2024-07-12 11:59:57.560399] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:07.538 [2024-07-12 11:59:57.772557] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:07.538 [2024-07-12 11:59:57.772686] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:08.105 [2024-07-12 11:59:58.103680] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:08.105 [2024-07-12 11:59:58.204882] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:08.105 [2024-07-12 11:59:58.205036] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:08.363 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:08.363 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:08.363 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:08.363 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:08.363 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:08.363 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.363 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:08.363 [2024-07-12 11:59:58.440065] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:08.363 [2024-07-12 11:59:58.440396] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:08.363 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:08.363 "name": "raid_bdev1", 00:19:08.363 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:08.363 "strip_size_kb": 0, 00:19:08.363 "state": "online", 00:19:08.363 "raid_level": "raid1", 00:19:08.363 "superblock": false, 00:19:08.363 "num_base_bdevs": 2, 00:19:08.363 "num_base_bdevs_discovered": 2, 00:19:08.363 "num_base_bdevs_operational": 2, 00:19:08.363 "process": { 00:19:08.363 "type": "rebuild", 00:19:08.363 "target": "spare", 00:19:08.363 "progress": { 00:19:08.363 "blocks": 14336, 00:19:08.363 "percent": 21 00:19:08.363 } 00:19:08.363 }, 00:19:08.363 "base_bdevs_list": [ 00:19:08.363 { 00:19:08.363 "name": "spare", 00:19:08.363 "uuid": "1fa806fe-df8d-514a-b7df-87fd18aba9f7", 00:19:08.363 "is_configured": true, 00:19:08.363 "data_offset": 0, 00:19:08.363 "data_size": 65536 00:19:08.363 }, 00:19:08.363 { 00:19:08.363 "name": "BaseBdev2", 00:19:08.363 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:08.363 "is_configured": true, 00:19:08.363 "data_offset": 0, 00:19:08.363 "data_size": 65536 00:19:08.363 } 00:19:08.363 ] 00:19:08.363 }' 00:19:08.363 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:08.622 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:08.622 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:08.622 [2024-07-12 11:59:58.652846] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:08.622 [2024-07-12 
11:59:58.653064] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:08.622 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:08.622 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:08.622 [2024-07-12 11:59:58.824574] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:08.881 [2024-07-12 11:59:58.902770] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:08.881 [2024-07-12 11:59:58.904146] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:08.881 [2024-07-12 11:59:58.904165] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:08.881 [2024-07-12 11:59:58.904172] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:08.881 [2024-07-12 11:59:58.920215] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x21f3d40 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.881 11:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.140 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.140 "name": "raid_bdev1", 00:19:09.140 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:09.140 "strip_size_kb": 0, 00:19:09.140 "state": "online", 00:19:09.140 "raid_level": "raid1", 00:19:09.140 "superblock": false, 00:19:09.140 "num_base_bdevs": 2, 00:19:09.140 "num_base_bdevs_discovered": 1, 00:19:09.140 "num_base_bdevs_operational": 1, 00:19:09.140 "base_bdevs_list": [ 00:19:09.140 { 00:19:09.140 "name": null, 00:19:09.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.140 "is_configured": false, 00:19:09.140 "data_offset": 0, 00:19:09.140 "data_size": 65536 00:19:09.140 }, 00:19:09.140 { 00:19:09.140 "name": "BaseBdev2", 00:19:09.140 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:09.140 "is_configured": true, 00:19:09.140 "data_offset": 0, 00:19:09.140 "data_size": 65536 00:19:09.140 } 00:19:09.140 ] 00:19:09.140 }' 00:19:09.140 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.140 11:59:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:09.398 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:09.398 11:59:59 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:09.398 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:09.398 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:09.398 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:09.398 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.398 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.656 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:09.656 "name": "raid_bdev1", 00:19:09.656 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:09.656 "strip_size_kb": 0, 00:19:09.656 "state": "online", 00:19:09.656 "raid_level": "raid1", 00:19:09.656 "superblock": false, 00:19:09.656 "num_base_bdevs": 2, 00:19:09.656 "num_base_bdevs_discovered": 1, 00:19:09.656 "num_base_bdevs_operational": 1, 00:19:09.656 "base_bdevs_list": [ 00:19:09.656 { 00:19:09.656 "name": null, 00:19:09.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.656 "is_configured": false, 00:19:09.656 "data_offset": 0, 00:19:09.656 "data_size": 65536 00:19:09.656 }, 00:19:09.656 { 00:19:09.656 "name": "BaseBdev2", 00:19:09.656 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:09.656 "is_configured": true, 00:19:09.656 "data_offset": 0, 00:19:09.656 "data_size": 65536 00:19:09.656 } 00:19:09.656 ] 00:19:09.656 }' 00:19:09.656 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:09.656 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:09.656 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:19:09.656 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:09.656 11:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:09.915 [2024-07-12 12:00:00.043479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:09.915 [2024-07-12 12:00:00.085139] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x205b730 00:19:09.915 [2024-07-12 12:00:00.086208] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:09.915 12:00:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:10.174 [2024-07-12 12:00:00.198473] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:10.174 [2024-07-12 12:00:00.198904] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:10.174 [2024-07-12 12:00:00.406495] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:10.174 [2024-07-12 12:00:00.406697] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:10.741 [2024-07-12 12:00:00.746508] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:10.999 [2024-07-12 12:00:01.073556] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:10.999 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:10.999 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:19:10.999 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:10.999 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:10.999 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:10.999 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.999 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:11.257 "name": "raid_bdev1", 00:19:11.257 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:11.257 "strip_size_kb": 0, 00:19:11.257 "state": "online", 00:19:11.257 "raid_level": "raid1", 00:19:11.257 "superblock": false, 00:19:11.257 "num_base_bdevs": 2, 00:19:11.257 "num_base_bdevs_discovered": 2, 00:19:11.257 "num_base_bdevs_operational": 2, 00:19:11.257 "process": { 00:19:11.257 "type": "rebuild", 00:19:11.257 "target": "spare", 00:19:11.257 "progress": { 00:19:11.257 "blocks": 14336, 00:19:11.257 "percent": 21 00:19:11.257 } 00:19:11.257 }, 00:19:11.257 "base_bdevs_list": [ 00:19:11.257 { 00:19:11.257 "name": "spare", 00:19:11.257 "uuid": "1fa806fe-df8d-514a-b7df-87fd18aba9f7", 00:19:11.257 "is_configured": true, 00:19:11.257 "data_offset": 0, 00:19:11.257 "data_size": 65536 00:19:11.257 }, 00:19:11.257 { 00:19:11.257 "name": "BaseBdev2", 00:19:11.257 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:11.257 "is_configured": true, 00:19:11.257 "data_offset": 0, 00:19:11.257 "data_size": 65536 00:19:11.257 } 00:19:11.257 ] 00:19:11.257 }' 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=621 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.257 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.513 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:11.513 "name": "raid_bdev1", 00:19:11.513 "uuid": 
"e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:11.513 "strip_size_kb": 0, 00:19:11.513 "state": "online", 00:19:11.513 "raid_level": "raid1", 00:19:11.513 "superblock": false, 00:19:11.513 "num_base_bdevs": 2, 00:19:11.513 "num_base_bdevs_discovered": 2, 00:19:11.513 "num_base_bdevs_operational": 2, 00:19:11.513 "process": { 00:19:11.513 "type": "rebuild", 00:19:11.513 "target": "spare", 00:19:11.513 "progress": { 00:19:11.513 "blocks": 18432, 00:19:11.513 "percent": 28 00:19:11.513 } 00:19:11.513 }, 00:19:11.513 "base_bdevs_list": [ 00:19:11.513 { 00:19:11.513 "name": "spare", 00:19:11.513 "uuid": "1fa806fe-df8d-514a-b7df-87fd18aba9f7", 00:19:11.513 "is_configured": true, 00:19:11.513 "data_offset": 0, 00:19:11.513 "data_size": 65536 00:19:11.513 }, 00:19:11.513 { 00:19:11.513 "name": "BaseBdev2", 00:19:11.513 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:11.513 "is_configured": true, 00:19:11.513 "data_offset": 0, 00:19:11.513 "data_size": 65536 00:19:11.513 } 00:19:11.513 ] 00:19:11.513 }' 00:19:11.513 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:11.513 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:11.513 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:11.513 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:11.513 12:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:11.513 [2024-07-12 12:00:01.615045] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:19:11.770 [2024-07-12 12:00:01.968012] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:19:12.333 [2024-07-12 12:00:02.276397] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:19:12.333 [2024-07-12 12:00:02.276666] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:19:12.333 [2024-07-12 12:00:02.384368] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:19:12.333 [2024-07-12 12:00:02.384456] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:12.591 [2024-07-12 12:00:02.729117] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:12.591 "name": "raid_bdev1", 00:19:12.591 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:12.591 "strip_size_kb": 0, 00:19:12.591 "state": "online", 00:19:12.591 "raid_level": 
"raid1", 00:19:12.591 "superblock": false, 00:19:12.591 "num_base_bdevs": 2, 00:19:12.591 "num_base_bdevs_discovered": 2, 00:19:12.591 "num_base_bdevs_operational": 2, 00:19:12.591 "process": { 00:19:12.591 "type": "rebuild", 00:19:12.591 "target": "spare", 00:19:12.591 "progress": { 00:19:12.591 "blocks": 40960, 00:19:12.591 "percent": 62 00:19:12.591 } 00:19:12.591 }, 00:19:12.591 "base_bdevs_list": [ 00:19:12.591 { 00:19:12.591 "name": "spare", 00:19:12.591 "uuid": "1fa806fe-df8d-514a-b7df-87fd18aba9f7", 00:19:12.591 "is_configured": true, 00:19:12.591 "data_offset": 0, 00:19:12.591 "data_size": 65536 00:19:12.591 }, 00:19:12.591 { 00:19:12.591 "name": "BaseBdev2", 00:19:12.591 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:12.591 "is_configured": true, 00:19:12.591 "data_offset": 0, 00:19:12.591 "data_size": 65536 00:19:12.591 } 00:19:12.591 ] 00:19:12.591 }' 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:12.591 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:12.849 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:12.849 12:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:12.849 [2024-07-12 12:00:02.948306] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:19:12.849 [2024-07-12 12:00:03.061507] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:19:13.416 [2024-07-12 12:00:03.505087] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:19:13.673 [2024-07-12 12:00:03.829222] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:19:13.673 12:00:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:13.673 12:00:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:13.673 12:00:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:13.673 12:00:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:13.673 12:00:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:13.673 12:00:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:13.673 12:00:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.673 12:00:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:13.931 12:00:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:13.931 "name": "raid_bdev1", 00:19:13.931 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:13.931 "strip_size_kb": 0, 00:19:13.931 "state": "online", 00:19:13.931 "raid_level": "raid1", 00:19:13.931 "superblock": false, 00:19:13.931 "num_base_bdevs": 2, 00:19:13.931 "num_base_bdevs_discovered": 2, 00:19:13.931 "num_base_bdevs_operational": 2, 00:19:13.931 "process": { 00:19:13.931 "type": "rebuild", 00:19:13.931 "target": "spare", 00:19:13.931 "progress": { 00:19:13.931 "blocks": 57344, 00:19:13.931 "percent": 87 00:19:13.931 } 00:19:13.931 }, 00:19:13.931 "base_bdevs_list": [ 00:19:13.931 { 00:19:13.931 "name": "spare", 00:19:13.931 "uuid": "1fa806fe-df8d-514a-b7df-87fd18aba9f7", 00:19:13.931 "is_configured": true, 00:19:13.931 "data_offset": 0, 00:19:13.931 "data_size": 
65536 00:19:13.931 }, 00:19:13.931 { 00:19:13.931 "name": "BaseBdev2", 00:19:13.931 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:13.931 "is_configured": true, 00:19:13.931 "data_offset": 0, 00:19:13.931 "data_size": 65536 00:19:13.931 } 00:19:13.931 ] 00:19:13.931 }' 00:19:13.931 12:00:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:13.931 [2024-07-12 12:00:04.043489] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:19:13.931 12:00:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:13.931 12:00:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:13.931 12:00:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:13.931 12:00:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:14.497 [2024-07-12 12:00:04.476845] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:14.497 [2024-07-12 12:00:04.577141] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:14.497 [2024-07-12 12:00:04.578236] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:15.062 "name": "raid_bdev1", 00:19:15.062 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:15.062 "strip_size_kb": 0, 00:19:15.062 "state": "online", 00:19:15.062 "raid_level": "raid1", 00:19:15.062 "superblock": false, 00:19:15.062 "num_base_bdevs": 2, 00:19:15.062 "num_base_bdevs_discovered": 2, 00:19:15.062 "num_base_bdevs_operational": 2, 00:19:15.062 "base_bdevs_list": [ 00:19:15.062 { 00:19:15.062 "name": "spare", 00:19:15.062 "uuid": "1fa806fe-df8d-514a-b7df-87fd18aba9f7", 00:19:15.062 "is_configured": true, 00:19:15.062 "data_offset": 0, 00:19:15.062 "data_size": 65536 00:19:15.062 }, 00:19:15.062 { 00:19:15.062 "name": "BaseBdev2", 00:19:15.062 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:15.062 "is_configured": true, 00:19:15.062 "data_offset": 0, 00:19:15.062 "data_size": 65536 00:19:15.062 } 00:19:15.062 ] 00:19:15.062 }' 00:19:15.062 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 
none none 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:15.320 "name": "raid_bdev1", 00:19:15.320 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:15.320 "strip_size_kb": 0, 00:19:15.320 "state": "online", 00:19:15.320 "raid_level": "raid1", 00:19:15.320 "superblock": false, 00:19:15.320 "num_base_bdevs": 2, 00:19:15.320 "num_base_bdevs_discovered": 2, 00:19:15.320 "num_base_bdevs_operational": 2, 00:19:15.320 "base_bdevs_list": [ 00:19:15.320 { 00:19:15.320 "name": "spare", 00:19:15.320 "uuid": "1fa806fe-df8d-514a-b7df-87fd18aba9f7", 00:19:15.320 "is_configured": true, 00:19:15.320 "data_offset": 0, 00:19:15.320 "data_size": 65536 00:19:15.320 }, 00:19:15.320 { 00:19:15.320 "name": "BaseBdev2", 00:19:15.320 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:15.320 "is_configured": true, 00:19:15.320 "data_offset": 0, 00:19:15.320 "data_size": 65536 00:19:15.320 } 00:19:15.320 ] 00:19:15.320 }' 00:19:15.320 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:15.579 12:00:05 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.579 "name": "raid_bdev1", 00:19:15.579 "uuid": "e7664c71-fd9b-4873-b159-62f9df32ee09", 00:19:15.579 "strip_size_kb": 0, 00:19:15.579 "state": "online", 00:19:15.579 "raid_level": "raid1", 00:19:15.579 "superblock": false, 00:19:15.579 "num_base_bdevs": 2, 
00:19:15.579 "num_base_bdevs_discovered": 2, 00:19:15.579 "num_base_bdevs_operational": 2, 00:19:15.579 "base_bdevs_list": [ 00:19:15.579 { 00:19:15.579 "name": "spare", 00:19:15.579 "uuid": "1fa806fe-df8d-514a-b7df-87fd18aba9f7", 00:19:15.579 "is_configured": true, 00:19:15.579 "data_offset": 0, 00:19:15.579 "data_size": 65536 00:19:15.579 }, 00:19:15.579 { 00:19:15.579 "name": "BaseBdev2", 00:19:15.579 "uuid": "b06b9963-b96c-5f9b-b716-bddec715576b", 00:19:15.579 "is_configured": true, 00:19:15.579 "data_offset": 0, 00:19:15.579 "data_size": 65536 00:19:15.579 } 00:19:15.579 ] 00:19:15.579 }' 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.579 12:00:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:16.145 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:16.403 [2024-07-12 12:00:06.429262] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:16.403 [2024-07-12 12:00:06.429292] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:16.403 00:19:16.403 Latency(us) 00:19:16.403 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:16.403 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:19:16.403 raid_bdev1 : 10.06 114.91 344.72 0.00 0.00 12003.41 243.81 107853.53 00:19:16.403 =================================================================================================================== 00:19:16.403 Total : 114.91 344.72 0.00 0.00 12003.41 243.81 107853.53 00:19:16.403 [2024-07-12 12:00:06.520205] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:16.403 [2024-07-12 12:00:06.520241] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:16.403 
[2024-07-12 12:00:06.520291] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:16.403 [2024-07-12 12:00:06.520297] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21f86c0 name raid_bdev1, state offline 00:19:16.403 0 00:19:16.403 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.403 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:19:16.663 /dev/nbd0 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:16.663 1+0 records in 00:19:16.663 1+0 records out 00:19:16.663 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212914 s, 19.2 MB/s 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 
0 ']' 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:16.663 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:16.922 12:00:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:19:16.922 /dev/nbd1 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:16.922 1+0 records in 00:19:16.922 1+0 records out 00:19:16.922 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000208698 s, 19.6 MB/s 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- 
# cmp -i 0 /dev/nbd0 /dev/nbd1 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:16.922 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:17.179 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 700784 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 700784 ']' 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 700784 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:17.437 12:00:07 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 700784 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 700784' 00:19:17.437 killing process with pid 700784 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 700784 00:19:17.437 Received shutdown signal, test time was about 11.153373 seconds 00:19:17.437 00:19:17.437 Latency(us) 00:19:17.437 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:17.437 =================================================================================================================== 00:19:17.437 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:19:17.437 [2024-07-12 12:00:07.613114] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:17.437 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 700784 00:19:17.437 [2024-07-12 12:00:07.631397] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:19:17.696 00:19:17.696 real 0m14.677s 00:19:17.696 user 0m21.866s 00:19:17.696 sys 0m1.875s 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:17.696 ************************************ 00:19:17.696 END TEST raid_rebuild_test_io 00:19:17.696 ************************************ 00:19:17.696 12:00:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:17.696 12:00:07 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test 
raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:19:17.696 12:00:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:19:17.696 12:00:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:17.696 12:00:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:17.696 ************************************ 00:19:17.696 START TEST raid_rebuild_test_sb_io 00:19:17.696 ************************************ 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:17.696 12:00:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=703638 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 703638 /var/tmp/spdk-raid.sock 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 703638 ']' 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local 
max_retries=100 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:17.696 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:17.696 12:00:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:17.991 [2024-07-12 12:00:07.942543] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:19:17.991 [2024-07-12 12:00:07.942581] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid703638 ] 00:19:17.991 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:17.991 Zero copy mechanism will not be used. 
00:19:17.991 [2024-07-12 12:00:08.006659] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:17.991 [2024-07-12 12:00:08.084475] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:19:17.991 [2024-07-12 12:00:08.138083] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:17.991 [2024-07-12 12:00:08.138108] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:18.554 12:00:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:18.554 12:00:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:19:18.554 12:00:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:18.554 12:00:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:18.811 BaseBdev1_malloc 00:19:18.812 12:00:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:19.070 [2024-07-12 12:00:09.065574] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:19.070 [2024-07-12 12:00:09.065607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:19.070 [2024-07-12 12:00:09.065619] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8bf010 00:19:19.070 [2024-07-12 12:00:09.065625] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:19.070 [2024-07-12 12:00:09.066768] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:19.070 [2024-07-12 12:00:09.066788] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:19.070 
BaseBdev1 00:19:19.070 12:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:19.070 12:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:19.070 BaseBdev2_malloc 00:19:19.070 12:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:19.328 [2024-07-12 12:00:09.394048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:19.328 [2024-07-12 12:00:09.394082] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:19.328 [2024-07-12 12:00:09.394095] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8bfb60 00:19:19.328 [2024-07-12 12:00:09.394117] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:19.328 [2024-07-12 12:00:09.395177] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:19.328 [2024-07-12 12:00:09.395197] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:19.328 BaseBdev2 00:19:19.328 12:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:19.328 spare_malloc 00:19:19.586 12:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:19.586 spare_delay 00:19:19.586 12:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:19.844 [2024-07-12 12:00:09.902884] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:19.844 [2024-07-12 12:00:09.902913] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:19.844 [2024-07-12 12:00:09.902924] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa6a880 00:19:19.844 [2024-07-12 12:00:09.902929] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:19.844 [2024-07-12 12:00:09.904027] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:19.844 [2024-07-12 12:00:09.904046] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:19.844 spare 00:19:19.844 12:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:19.844 [2024-07-12 12:00:10.067343] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:19.844 [2024-07-12 12:00:10.068310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:19.844 [2024-07-12 12:00:10.068430] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa6e6c0 00:19:19.844 [2024-07-12 12:00:10.068438] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:19.844 [2024-07-12 12:00:10.068589] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa6ab10 00:19:19.844 [2024-07-12 12:00:10.068691] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa6e6c0 00:19:19.844 [2024-07-12 12:00:10.068697] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0xa6e6c0 00:19:19.844 [2024-07-12 12:00:10.068768] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:19.844 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:19.844 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.845 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:20.103 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.103 "name": "raid_bdev1", 00:19:20.103 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:20.103 "strip_size_kb": 0, 00:19:20.103 "state": "online", 00:19:20.103 "raid_level": "raid1", 00:19:20.103 "superblock": true, 00:19:20.103 "num_base_bdevs": 2, 00:19:20.103 
"num_base_bdevs_discovered": 2, 00:19:20.103 "num_base_bdevs_operational": 2, 00:19:20.103 "base_bdevs_list": [ 00:19:20.103 { 00:19:20.103 "name": "BaseBdev1", 00:19:20.103 "uuid": "f97c6620-2e22-52e9-be62-edef92ff5c37", 00:19:20.103 "is_configured": true, 00:19:20.103 "data_offset": 2048, 00:19:20.103 "data_size": 63488 00:19:20.103 }, 00:19:20.103 { 00:19:20.103 "name": "BaseBdev2", 00:19:20.103 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:20.103 "is_configured": true, 00:19:20.103 "data_offset": 2048, 00:19:20.103 "data_size": 63488 00:19:20.103 } 00:19:20.103 ] 00:19:20.103 }' 00:19:20.103 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.103 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:20.666 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:20.666 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:20.666 [2024-07-12 12:00:10.857501] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:20.666 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:19:20.666 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.666 12:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:20.922 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:19:20.922 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:19:20.922 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:20.922 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:20.922 [2024-07-12 12:00:11.139846] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa6ff60 00:19:20.922 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:20.922 Zero copy mechanism will not be used. 00:19:20.922 Running I/O for 60 seconds... 00:19:21.180 [2024-07-12 12:00:11.204449] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:21.180 [2024-07-12 12:00:11.209533] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa6ff60 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.180 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:21.447 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.447 "name": "raid_bdev1", 00:19:21.447 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:21.447 "strip_size_kb": 0, 00:19:21.447 "state": "online", 00:19:21.447 "raid_level": "raid1", 00:19:21.447 "superblock": true, 00:19:21.447 "num_base_bdevs": 2, 00:19:21.447 "num_base_bdevs_discovered": 1, 00:19:21.447 "num_base_bdevs_operational": 1, 00:19:21.447 "base_bdevs_list": [ 00:19:21.447 { 00:19:21.447 "name": null, 00:19:21.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.447 "is_configured": false, 00:19:21.447 "data_offset": 2048, 00:19:21.447 "data_size": 63488 00:19:21.447 }, 00:19:21.447 { 00:19:21.447 "name": "BaseBdev2", 00:19:21.447 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:21.447 "is_configured": true, 00:19:21.447 "data_offset": 2048, 00:19:21.447 "data_size": 63488 00:19:21.447 } 00:19:21.447 ] 00:19:21.447 }' 00:19:21.447 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.447 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:21.707 12:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:21.990 [2024-07-12 12:00:12.076094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:21.990 [2024-07-12 12:00:12.106134] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa69ed0 00:19:21.990 [2024-07-12 
12:00:12.107789] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:21.990 12:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:22.264 [2024-07-12 12:00:12.220496] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:22.264 [2024-07-12 12:00:12.220845] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:22.264 [2024-07-12 12:00:12.439048] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:22.264 [2024-07-12 12:00:12.439143] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:22.523 [2024-07-12 12:00:12.753226] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:22.523 [2024-07-12 12:00:12.753540] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:22.781 [2024-07-12 12:00:12.961681] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:23.043 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:23.043 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:23.043 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:23.043 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:23.043 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:23.043 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.043 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:23.043 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:23.043 "name": "raid_bdev1", 00:19:23.043 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:23.043 "strip_size_kb": 0, 00:19:23.043 "state": "online", 00:19:23.043 "raid_level": "raid1", 00:19:23.043 "superblock": true, 00:19:23.043 "num_base_bdevs": 2, 00:19:23.043 "num_base_bdevs_discovered": 2, 00:19:23.043 "num_base_bdevs_operational": 2, 00:19:23.043 "process": { 00:19:23.043 "type": "rebuild", 00:19:23.043 "target": "spare", 00:19:23.043 "progress": { 00:19:23.043 "blocks": 14336, 00:19:23.043 "percent": 22 00:19:23.043 } 00:19:23.043 }, 00:19:23.043 "base_bdevs_list": [ 00:19:23.043 { 00:19:23.043 "name": "spare", 00:19:23.043 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:23.043 "is_configured": true, 00:19:23.043 "data_offset": 2048, 00:19:23.043 "data_size": 63488 00:19:23.043 }, 00:19:23.043 { 00:19:23.043 "name": "BaseBdev2", 00:19:23.043 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:23.043 "is_configured": true, 00:19:23.043 "data_offset": 2048, 00:19:23.043 "data_size": 63488 00:19:23.043 } 00:19:23.043 ] 00:19:23.043 }' 00:19:23.043 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:23.304 [2024-07-12 12:00:13.299334] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:23.304 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:23.304 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:23.304 12:00:13 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:23.304 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:23.304 [2024-07-12 12:00:13.520601] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:23.563 [2024-07-12 12:00:13.654540] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:23.563 [2024-07-12 12:00:13.655769] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:23.563 [2024-07-12 12:00:13.655788] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:23.563 [2024-07-12 12:00:13.655794] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:23.563 [2024-07-12 12:00:13.676499] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa6ff60 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:23.563 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.820 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:23.820 "name": "raid_bdev1", 00:19:23.820 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:23.820 "strip_size_kb": 0, 00:19:23.820 "state": "online", 00:19:23.820 "raid_level": "raid1", 00:19:23.820 "superblock": true, 00:19:23.820 "num_base_bdevs": 2, 00:19:23.820 "num_base_bdevs_discovered": 1, 00:19:23.821 "num_base_bdevs_operational": 1, 00:19:23.821 "base_bdevs_list": [ 00:19:23.821 { 00:19:23.821 "name": null, 00:19:23.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:23.821 "is_configured": false, 00:19:23.821 "data_offset": 2048, 00:19:23.821 "data_size": 63488 00:19:23.821 }, 00:19:23.821 { 00:19:23.821 "name": "BaseBdev2", 00:19:23.821 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:23.821 "is_configured": true, 00:19:23.821 "data_offset": 2048, 00:19:23.821 "data_size": 63488 00:19:23.821 } 00:19:23.821 ] 00:19:23.821 }' 00:19:23.821 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.821 12:00:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # 
local process_type=none 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:24.387 "name": "raid_bdev1", 00:19:24.387 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:24.387 "strip_size_kb": 0, 00:19:24.387 "state": "online", 00:19:24.387 "raid_level": "raid1", 00:19:24.387 "superblock": true, 00:19:24.387 "num_base_bdevs": 2, 00:19:24.387 "num_base_bdevs_discovered": 1, 00:19:24.387 "num_base_bdevs_operational": 1, 00:19:24.387 "base_bdevs_list": [ 00:19:24.387 { 00:19:24.387 "name": null, 00:19:24.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.387 "is_configured": false, 00:19:24.387 "data_offset": 2048, 00:19:24.387 "data_size": 63488 00:19:24.387 }, 00:19:24.387 { 00:19:24.387 "name": "BaseBdev2", 00:19:24.387 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:24.387 "is_configured": true, 00:19:24.387 "data_offset": 2048, 00:19:24.387 "data_size": 63488 00:19:24.387 } 00:19:24.387 ] 00:19:24.387 }' 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 
00:19:24.387 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:24.646 [2024-07-12 12:00:14.774266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:24.646 [2024-07-12 12:00:14.808757] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8d0f50 00:19:24.646 [2024-07-12 12:00:14.809799] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:24.646 12:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:24.904 [2024-07-12 12:00:14.921915] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:24.904 [2024-07-12 12:00:14.922292] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:24.904 [2024-07-12 12:00:15.035637] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:24.904 [2024-07-12 12:00:15.035732] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:25.163 [2024-07-12 12:00:15.272458] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:25.163 [2024-07-12 12:00:15.272785] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:25.424 [2024-07-12 12:00:15.481220] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:25.682 [2024-07-12 12:00:15.715452] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:25.682 12:00:15 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:25.682 12:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:25.682 12:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:25.682 12:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:25.682 12:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:25.682 [2024-07-12 12:00:15.824028] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:25.682 12:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.682 12:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:25.940 12:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:25.940 "name": "raid_bdev1", 00:19:25.940 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:25.940 "strip_size_kb": 0, 00:19:25.940 "state": "online", 00:19:25.940 "raid_level": "raid1", 00:19:25.940 "superblock": true, 00:19:25.940 "num_base_bdevs": 2, 00:19:25.940 "num_base_bdevs_discovered": 2, 00:19:25.940 "num_base_bdevs_operational": 2, 00:19:25.940 "process": { 00:19:25.940 "type": "rebuild", 00:19:25.940 "target": "spare", 00:19:25.940 "progress": { 00:19:25.940 "blocks": 16384, 00:19:25.940 "percent": 25 00:19:25.940 } 00:19:25.940 }, 00:19:25.940 "base_bdevs_list": [ 00:19:25.940 { 00:19:25.940 "name": "spare", 00:19:25.940 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:25.940 "is_configured": true, 00:19:25.940 "data_offset": 2048, 00:19:25.940 "data_size": 63488 00:19:25.940 }, 00:19:25.940 { 
00:19:25.940 "name": "BaseBdev2", 00:19:25.940 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:25.940 "is_configured": true, 00:19:25.940 "data_offset": 2048, 00:19:25.940 "data_size": 63488 00:19:25.940 } 00:19:25.940 ] 00:19:25.940 }' 00:19:25.940 12:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:19:25.940 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=636 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:25.940 12:00:16 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.940 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:26.199 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:26.199 "name": "raid_bdev1", 00:19:26.199 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:26.199 "strip_size_kb": 0, 00:19:26.199 "state": "online", 00:19:26.199 "raid_level": "raid1", 00:19:26.199 "superblock": true, 00:19:26.199 "num_base_bdevs": 2, 00:19:26.199 "num_base_bdevs_discovered": 2, 00:19:26.199 "num_base_bdevs_operational": 2, 00:19:26.199 "process": { 00:19:26.199 "type": "rebuild", 00:19:26.199 "target": "spare", 00:19:26.199 "progress": { 00:19:26.199 "blocks": 20480, 00:19:26.199 "percent": 32 00:19:26.199 } 00:19:26.199 }, 00:19:26.199 "base_bdevs_list": [ 00:19:26.199 { 00:19:26.199 "name": "spare", 00:19:26.199 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:26.199 "is_configured": true, 00:19:26.199 "data_offset": 2048, 00:19:26.199 "data_size": 63488 00:19:26.199 }, 00:19:26.199 { 00:19:26.199 "name": "BaseBdev2", 00:19:26.199 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:26.199 "is_configured": true, 00:19:26.199 "data_offset": 2048, 00:19:26.199 "data_size": 63488 00:19:26.199 } 00:19:26.199 ] 00:19:26.199 }' 00:19:26.199 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:26.199 [2024-07-12 12:00:16.294187] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:19:26.199 
12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:26.199 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:26.199 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:26.199 12:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:27.134 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:27.134 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:27.134 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:27.134 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:27.134 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:27.134 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:27.134 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.134 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:27.397 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:27.397 "name": "raid_bdev1", 00:19:27.397 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:27.397 "strip_size_kb": 0, 00:19:27.397 "state": "online", 00:19:27.397 "raid_level": "raid1", 00:19:27.397 "superblock": true, 00:19:27.397 "num_base_bdevs": 2, 00:19:27.397 "num_base_bdevs_discovered": 2, 00:19:27.397 "num_base_bdevs_operational": 2, 00:19:27.397 "process": { 00:19:27.397 "type": "rebuild", 00:19:27.397 "target": "spare", 
00:19:27.397 "progress": { 00:19:27.397 "blocks": 43008, 00:19:27.397 "percent": 67 00:19:27.397 } 00:19:27.397 }, 00:19:27.397 "base_bdevs_list": [ 00:19:27.397 { 00:19:27.397 "name": "spare", 00:19:27.397 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:27.397 "is_configured": true, 00:19:27.397 "data_offset": 2048, 00:19:27.397 "data_size": 63488 00:19:27.397 }, 00:19:27.397 { 00:19:27.397 "name": "BaseBdev2", 00:19:27.397 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:27.397 "is_configured": true, 00:19:27.397 "data_offset": 2048, 00:19:27.397 "data_size": 63488 00:19:27.397 } 00:19:27.397 ] 00:19:27.397 }' 00:19:27.397 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:27.397 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:27.397 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:27.397 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:27.397 12:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:27.659 [2024-07-12 12:00:17.870812] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:19:28.226 [2024-07-12 12:00:18.295797] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:19:28.484 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:28.484 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:28.484 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:28.484 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 
00:19:28.484 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:28.484 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:28.484 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.484 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:28.484 [2024-07-12 12:00:18.608537] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:28.484 [2024-07-12 12:00:18.708787] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:28.484 [2024-07-12 12:00:18.710266] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:28.742 "name": "raid_bdev1", 00:19:28.742 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:28.742 "strip_size_kb": 0, 00:19:28.742 "state": "online", 00:19:28.742 "raid_level": "raid1", 00:19:28.742 "superblock": true, 00:19:28.742 "num_base_bdevs": 2, 00:19:28.742 "num_base_bdevs_discovered": 2, 00:19:28.742 "num_base_bdevs_operational": 2, 00:19:28.742 "base_bdevs_list": [ 00:19:28.742 { 00:19:28.742 "name": "spare", 00:19:28.742 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:28.742 "is_configured": true, 00:19:28.742 "data_offset": 2048, 00:19:28.742 "data_size": 63488 00:19:28.742 }, 00:19:28.742 { 00:19:28.742 "name": "BaseBdev2", 00:19:28.742 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:28.742 "is_configured": true, 00:19:28.742 "data_offset": 2048, 00:19:28.742 "data_size": 63488 00:19:28.742 } 00:19:28.742 ] 00:19:28.742 }' 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.742 12:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:29.000 "name": "raid_bdev1", 00:19:29.000 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:29.000 "strip_size_kb": 0, 00:19:29.000 "state": "online", 00:19:29.000 "raid_level": "raid1", 00:19:29.000 "superblock": true, 00:19:29.000 "num_base_bdevs": 2, 00:19:29.000 "num_base_bdevs_discovered": 2, 00:19:29.000 "num_base_bdevs_operational": 2, 00:19:29.000 "base_bdevs_list": [ 00:19:29.000 { 00:19:29.000 "name": "spare", 00:19:29.000 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:29.000 "is_configured": true, 
00:19:29.000 "data_offset": 2048, 00:19:29.000 "data_size": 63488 00:19:29.000 }, 00:19:29.000 { 00:19:29.000 "name": "BaseBdev2", 00:19:29.000 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:29.000 "is_configured": true, 00:19:29.000 "data_offset": 2048, 00:19:29.000 "data_size": 63488 00:19:29.000 } 00:19:29.000 ] 00:19:29.000 }' 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.000 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:29.258 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:29.258 "name": "raid_bdev1", 00:19:29.258 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:29.258 "strip_size_kb": 0, 00:19:29.258 "state": "online", 00:19:29.258 "raid_level": "raid1", 00:19:29.258 "superblock": true, 00:19:29.258 "num_base_bdevs": 2, 00:19:29.258 "num_base_bdevs_discovered": 2, 00:19:29.258 "num_base_bdevs_operational": 2, 00:19:29.258 "base_bdevs_list": [ 00:19:29.258 { 00:19:29.258 "name": "spare", 00:19:29.258 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:29.258 "is_configured": true, 00:19:29.258 "data_offset": 2048, 00:19:29.258 "data_size": 63488 00:19:29.258 }, 00:19:29.258 { 00:19:29.258 "name": "BaseBdev2", 00:19:29.258 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:29.258 "is_configured": true, 00:19:29.258 "data_offset": 2048, 00:19:29.258 "data_size": 63488 00:19:29.258 } 00:19:29.258 ] 00:19:29.258 }' 00:19:29.258 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:29.258 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:29.516 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:29.774 [2024-07-12 12:00:19.873813] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:29.774 [2024-07-12 12:00:19.873839] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:29.774 00:19:29.774 Latency(us) 00:19:29.774 Device Information : runtime(s) IOPS MiB/s 
Fail/s TO/s Average min max 00:19:29.774 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:19:29.774 raid_bdev1 : 8.73 126.79 380.36 0.00 0.00 10698.68 236.01 113845.39 00:19:29.774 =================================================================================================================== 00:19:29.774 Total : 126.79 380.36 0.00 0.00 10698.68 236.01 113845.39 00:19:29.774 [2024-07-12 12:00:19.896603] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:29.774 [2024-07-12 12:00:19.896639] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:29.775 [2024-07-12 12:00:19.896687] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:29.775 [2024-07-12 12:00:19.896693] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa6e6c0 name raid_bdev1, state offline 00:19:29.775 0 00:19:29.775 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.775 12:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@10 -- # local bdev_list 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:30.033 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:19:30.033 /dev/nbd0 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:30.291 1+0 records in 00:19:30.291 1+0 records out 00:19:30.291 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021332 s, 19.2 MB/s 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:19:30.291 /dev/nbd1 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:30.291 1+0 records in 00:19:30.291 1+0 records out 00:19:30.291 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000192052 s, 21.3 MB/s 00:19:30.291 12:00:20 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:30.291 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:30.551 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:30.810 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:30.810 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:30.810 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:30.810 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:30.810 12:00:20 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:30.810 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:30.810 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:30.810 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:30.810 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:19:30.810 12:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:31.069 [2024-07-12 12:00:21.256097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:31.069 [2024-07-12 12:00:21.256127] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:31.069 [2024-07-12 12:00:21.256138] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8cf9e0 00:19:31.069 [2024-07-12 12:00:21.256145] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:31.069 [2024-07-12 12:00:21.257380] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:31.069 [2024-07-12 12:00:21.257401] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:31.069 [2024-07-12 12:00:21.257456] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:31.069 [2024-07-12 12:00:21.257477] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:31.069 [2024-07-12 12:00:21.257558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:19:31.069 spare 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:31.069 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.328 [2024-07-12 12:00:21.357852] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa69700 00:19:31.328 [2024-07-12 12:00:21.357863] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:31.328 [2024-07-12 12:00:21.357990] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa68a80 00:19:31.328 [2024-07-12 12:00:21.358084] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa69700 00:19:31.328 
[2024-07-12 12:00:21.358090] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa69700 00:19:31.329 [2024-07-12 12:00:21.358155] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:31.329 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.329 "name": "raid_bdev1", 00:19:31.329 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:31.329 "strip_size_kb": 0, 00:19:31.329 "state": "online", 00:19:31.329 "raid_level": "raid1", 00:19:31.329 "superblock": true, 00:19:31.329 "num_base_bdevs": 2, 00:19:31.329 "num_base_bdevs_discovered": 2, 00:19:31.329 "num_base_bdevs_operational": 2, 00:19:31.329 "base_bdevs_list": [ 00:19:31.329 { 00:19:31.329 "name": "spare", 00:19:31.329 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:31.329 "is_configured": true, 00:19:31.329 "data_offset": 2048, 00:19:31.329 "data_size": 63488 00:19:31.329 }, 00:19:31.329 { 00:19:31.329 "name": "BaseBdev2", 00:19:31.329 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:31.329 "is_configured": true, 00:19:31.329 "data_offset": 2048, 00:19:31.329 "data_size": 63488 00:19:31.329 } 00:19:31.329 ] 00:19:31.329 }' 00:19:31.329 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:31.329 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:31.895 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:31.895 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:31.895 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:31.895 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:31.895 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local 
raid_bdev_info 00:19:31.895 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.895 12:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:31.895 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:31.895 "name": "raid_bdev1", 00:19:31.895 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:31.895 "strip_size_kb": 0, 00:19:31.895 "state": "online", 00:19:31.895 "raid_level": "raid1", 00:19:31.895 "superblock": true, 00:19:31.895 "num_base_bdevs": 2, 00:19:31.895 "num_base_bdevs_discovered": 2, 00:19:31.895 "num_base_bdevs_operational": 2, 00:19:31.895 "base_bdevs_list": [ 00:19:31.895 { 00:19:31.895 "name": "spare", 00:19:31.895 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:31.895 "is_configured": true, 00:19:31.895 "data_offset": 2048, 00:19:31.895 "data_size": 63488 00:19:31.895 }, 00:19:31.895 { 00:19:31.895 "name": "BaseBdev2", 00:19:31.895 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:31.895 "is_configured": true, 00:19:31.895 "data_offset": 2048, 00:19:31.895 "data_size": 63488 00:19:31.895 } 00:19:31.895 ] 00:19:31.895 }' 00:19:31.895 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:32.154 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:32.154 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:32.154 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:32.154 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.154 12:00:22 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:19:32.154 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:19:32.154 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:32.413 [2024-07-12 12:00:22.523538] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.413 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:32.672 
12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.672 "name": "raid_bdev1", 00:19:32.672 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:32.672 "strip_size_kb": 0, 00:19:32.672 "state": "online", 00:19:32.672 "raid_level": "raid1", 00:19:32.672 "superblock": true, 00:19:32.672 "num_base_bdevs": 2, 00:19:32.672 "num_base_bdevs_discovered": 1, 00:19:32.672 "num_base_bdevs_operational": 1, 00:19:32.672 "base_bdevs_list": [ 00:19:32.672 { 00:19:32.672 "name": null, 00:19:32.672 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.672 "is_configured": false, 00:19:32.672 "data_offset": 2048, 00:19:32.672 "data_size": 63488 00:19:32.672 }, 00:19:32.672 { 00:19:32.672 "name": "BaseBdev2", 00:19:32.672 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:32.672 "is_configured": true, 00:19:32.672 "data_offset": 2048, 00:19:32.672 "data_size": 63488 00:19:32.672 } 00:19:32.672 ] 00:19:32.672 }' 00:19:32.672 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.672 12:00:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:32.933 12:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:33.192 [2024-07-12 12:00:23.321679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:33.192 [2024-07-12 12:00:23.321794] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:33.192 [2024-07-12 12:00:23.321803] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:19:33.192 [2024-07-12 12:00:23.321820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:33.192 [2024-07-12 12:00:23.326404] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8bf960 00:19:33.192 [2024-07-12 12:00:23.327772] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:33.192 12:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:19:34.128 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:34.128 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:34.128 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:34.128 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:34.128 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:34.128 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.128 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:34.386 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:34.386 "name": "raid_bdev1", 00:19:34.386 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:34.386 "strip_size_kb": 0, 00:19:34.386 "state": "online", 00:19:34.386 "raid_level": "raid1", 00:19:34.386 "superblock": true, 00:19:34.386 "num_base_bdevs": 2, 00:19:34.386 "num_base_bdevs_discovered": 2, 00:19:34.386 "num_base_bdevs_operational": 2, 00:19:34.386 "process": { 00:19:34.386 "type": "rebuild", 00:19:34.386 "target": "spare", 00:19:34.386 "progress": { 00:19:34.386 "blocks": 22528, 
00:19:34.386 "percent": 35 00:19:34.386 } 00:19:34.386 }, 00:19:34.386 "base_bdevs_list": [ 00:19:34.386 { 00:19:34.386 "name": "spare", 00:19:34.386 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:34.386 "is_configured": true, 00:19:34.386 "data_offset": 2048, 00:19:34.386 "data_size": 63488 00:19:34.386 }, 00:19:34.386 { 00:19:34.386 "name": "BaseBdev2", 00:19:34.386 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:34.386 "is_configured": true, 00:19:34.386 "data_offset": 2048, 00:19:34.386 "data_size": 63488 00:19:34.386 } 00:19:34.386 ] 00:19:34.386 }' 00:19:34.386 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:34.386 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:34.386 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:34.386 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:34.386 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:34.645 [2024-07-12 12:00:24.758343] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:34.645 [2024-07-12 12:00:24.838530] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:34.645 [2024-07-12 12:00:24.838561] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:34.645 [2024-07-12 12:00:24.838570] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:34.645 [2024-07-12 12:00:24.838574] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.645 12:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:34.904 12:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.904 "name": "raid_bdev1", 00:19:34.904 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:34.904 "strip_size_kb": 0, 00:19:34.904 "state": "online", 00:19:34.904 "raid_level": "raid1", 00:19:34.904 "superblock": true, 00:19:34.904 "num_base_bdevs": 2, 00:19:34.904 "num_base_bdevs_discovered": 1, 00:19:34.904 "num_base_bdevs_operational": 1, 00:19:34.904 "base_bdevs_list": [ 00:19:34.904 { 00:19:34.904 "name": null, 00:19:34.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:34.904 "is_configured": false, 00:19:34.904 
"data_offset": 2048, 00:19:34.904 "data_size": 63488 00:19:34.904 }, 00:19:34.904 { 00:19:34.904 "name": "BaseBdev2", 00:19:34.904 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:34.904 "is_configured": true, 00:19:34.904 "data_offset": 2048, 00:19:34.904 "data_size": 63488 00:19:34.904 } 00:19:34.904 ] 00:19:34.904 }' 00:19:34.904 12:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.904 12:00:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:35.472 12:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:35.472 [2024-07-12 12:00:25.673003] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:35.472 [2024-07-12 12:00:25.673036] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:35.472 [2024-07-12 12:00:25.673047] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8cfe80 00:19:35.472 [2024-07-12 12:00:25.673069] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:35.472 [2024-07-12 12:00:25.673331] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:35.472 [2024-07-12 12:00:25.673340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:35.472 [2024-07-12 12:00:25.673394] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:35.472 [2024-07-12 12:00:25.673401] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:35.472 [2024-07-12 12:00:25.673407] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:19:35.472 [2024-07-12 12:00:25.673418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:35.472 [2024-07-12 12:00:25.677955] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa6c2a0 00:19:35.472 spare 00:19:35.472 [2024-07-12 12:00:25.679014] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:35.472 12:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:36.864 "name": "raid_bdev1", 00:19:36.864 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:36.864 "strip_size_kb": 0, 00:19:36.864 "state": "online", 00:19:36.864 "raid_level": "raid1", 00:19:36.864 "superblock": true, 00:19:36.864 "num_base_bdevs": 2, 00:19:36.864 "num_base_bdevs_discovered": 2, 00:19:36.864 "num_base_bdevs_operational": 2, 00:19:36.864 "process": { 00:19:36.864 "type": "rebuild", 00:19:36.864 "target": "spare", 00:19:36.864 "progress": { 00:19:36.864 
"blocks": 22528, 00:19:36.864 "percent": 35 00:19:36.864 } 00:19:36.864 }, 00:19:36.864 "base_bdevs_list": [ 00:19:36.864 { 00:19:36.864 "name": "spare", 00:19:36.864 "uuid": "a58b7ebb-47b8-5531-aaa3-acc8788361aa", 00:19:36.864 "is_configured": true, 00:19:36.864 "data_offset": 2048, 00:19:36.864 "data_size": 63488 00:19:36.864 }, 00:19:36.864 { 00:19:36.864 "name": "BaseBdev2", 00:19:36.864 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:36.864 "is_configured": true, 00:19:36.864 "data_offset": 2048, 00:19:36.864 "data_size": 63488 00:19:36.864 } 00:19:36.864 ] 00:19:36.864 }' 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:36.864 12:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:37.124 [2024-07-12 12:00:27.113706] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:37.124 [2024-07-12 12:00:27.189558] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:37.124 [2024-07-12 12:00:27.189585] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:37.124 [2024-07-12 12:00:27.189594] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:37.124 [2024-07-12 12:00:27.189614] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:37.124 "name": "raid_bdev1", 00:19:37.124 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:37.124 "strip_size_kb": 0, 00:19:37.124 "state": "online", 00:19:37.124 "raid_level": "raid1", 00:19:37.124 "superblock": true, 00:19:37.124 "num_base_bdevs": 2, 00:19:37.124 "num_base_bdevs_discovered": 1, 00:19:37.124 "num_base_bdevs_operational": 1, 00:19:37.124 "base_bdevs_list": [ 00:19:37.124 { 00:19:37.124 "name": null, 00:19:37.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:37.124 "is_configured": false, 00:19:37.124 
"data_offset": 2048, 00:19:37.124 "data_size": 63488 00:19:37.124 }, 00:19:37.124 { 00:19:37.124 "name": "BaseBdev2", 00:19:37.124 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:37.124 "is_configured": true, 00:19:37.124 "data_offset": 2048, 00:19:37.124 "data_size": 63488 00:19:37.124 } 00:19:37.124 ] 00:19:37.124 }' 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:37.124 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:37.690 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:37.690 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:37.690 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:37.690 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:37.690 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:37.690 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:37.690 12:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.949 12:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:37.949 "name": "raid_bdev1", 00:19:37.949 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:37.949 "strip_size_kb": 0, 00:19:37.949 "state": "online", 00:19:37.949 "raid_level": "raid1", 00:19:37.949 "superblock": true, 00:19:37.949 "num_base_bdevs": 2, 00:19:37.949 "num_base_bdevs_discovered": 1, 00:19:37.949 "num_base_bdevs_operational": 1, 00:19:37.949 "base_bdevs_list": [ 00:19:37.949 { 00:19:37.949 "name": null, 00:19:37.949 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:37.949 "is_configured": false, 00:19:37.949 "data_offset": 2048, 00:19:37.949 "data_size": 63488 00:19:37.949 }, 00:19:37.949 { 00:19:37.949 "name": "BaseBdev2", 00:19:37.949 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:37.949 "is_configured": true, 00:19:37.949 "data_offset": 2048, 00:19:37.949 "data_size": 63488 00:19:37.949 } 00:19:37.949 ] 00:19:37.949 }' 00:19:37.949 12:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:37.949 12:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:37.949 12:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:37.949 12:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:37.949 12:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:19:38.208 12:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:38.208 [2024-07-12 12:00:28.421154] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:38.208 [2024-07-12 12:00:28.421185] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:38.208 [2024-07-12 12:00:28.421195] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d2ba0 00:19:38.208 [2024-07-12 12:00:28.421201] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:38.208 [2024-07-12 12:00:28.421432] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:38.208 [2024-07-12 12:00:28.421441] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:38.208 [2024-07-12 12:00:28.421482] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:19:38.208 [2024-07-12 12:00:28.421489] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:38.208 [2024-07-12 12:00:28.421494] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:38.208 BaseBdev1 00:19:38.208 12:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.588 12:00:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.588 "name": "raid_bdev1", 00:19:39.588 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:39.588 "strip_size_kb": 0, 00:19:39.588 "state": "online", 00:19:39.588 "raid_level": "raid1", 00:19:39.588 "superblock": true, 00:19:39.588 "num_base_bdevs": 2, 00:19:39.588 "num_base_bdevs_discovered": 1, 00:19:39.588 "num_base_bdevs_operational": 1, 00:19:39.588 "base_bdevs_list": [ 00:19:39.588 { 00:19:39.588 "name": null, 00:19:39.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.588 "is_configured": false, 00:19:39.588 "data_offset": 2048, 00:19:39.588 "data_size": 63488 00:19:39.588 }, 00:19:39.588 { 00:19:39.588 "name": "BaseBdev2", 00:19:39.588 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:39.588 "is_configured": true, 00:19:39.588 "data_offset": 2048, 00:19:39.588 "data_size": 63488 00:19:39.588 } 00:19:39.588 ] 00:19:39.588 }' 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.588 12:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:40.156 "name": "raid_bdev1", 00:19:40.156 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:40.156 "strip_size_kb": 0, 00:19:40.156 "state": "online", 00:19:40.156 "raid_level": "raid1", 00:19:40.156 "superblock": true, 00:19:40.156 "num_base_bdevs": 2, 00:19:40.156 "num_base_bdevs_discovered": 1, 00:19:40.156 "num_base_bdevs_operational": 1, 00:19:40.156 "base_bdevs_list": [ 00:19:40.156 { 00:19:40.156 "name": null, 00:19:40.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:40.156 "is_configured": false, 00:19:40.156 "data_offset": 2048, 00:19:40.156 "data_size": 63488 00:19:40.156 }, 00:19:40.156 { 00:19:40.156 "name": "BaseBdev2", 00:19:40.156 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:40.156 "is_configured": true, 00:19:40.156 "data_offset": 2048, 00:19:40.156 "data_size": 63488 00:19:40.156 } 00:19:40.156 ] 00:19:40.156 }' 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local 
es=0 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:40.156 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:40.415 [2024-07-12 12:00:30.522771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:40.415 [2024-07-12 12:00:30.522860] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:40.415 
[2024-07-12 12:00:30.522868] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:40.415 request: 00:19:40.415 { 00:19:40.415 "raid_bdev": "raid_bdev1", 00:19:40.415 "base_bdev": "BaseBdev1", 00:19:40.415 "method": "bdev_raid_add_base_bdev", 00:19:40.415 "req_id": 1 00:19:40.415 } 00:19:40.415 Got JSON-RPC error response 00:19:40.415 response: 00:19:40.415 { 00:19:40.415 "code": -22, 00:19:40.415 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:19:40.415 } 00:19:40.415 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:19:40.415 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:40.415 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:40.415 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:40.415 12:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.360 12:00:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.360 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:41.618 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.618 "name": "raid_bdev1", 00:19:41.618 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:41.618 "strip_size_kb": 0, 00:19:41.618 "state": "online", 00:19:41.618 "raid_level": "raid1", 00:19:41.618 "superblock": true, 00:19:41.618 "num_base_bdevs": 2, 00:19:41.618 "num_base_bdevs_discovered": 1, 00:19:41.618 "num_base_bdevs_operational": 1, 00:19:41.618 "base_bdevs_list": [ 00:19:41.618 { 00:19:41.618 "name": null, 00:19:41.618 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.618 "is_configured": false, 00:19:41.618 "data_offset": 2048, 00:19:41.618 "data_size": 63488 00:19:41.618 }, 00:19:41.618 { 00:19:41.618 "name": "BaseBdev2", 00:19:41.618 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:41.618 "is_configured": true, 00:19:41.618 "data_offset": 2048, 00:19:41.618 "data_size": 63488 00:19:41.618 } 00:19:41.618 ] 00:19:41.618 }' 00:19:41.618 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.618 12:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:42.185 12:00:32 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:42.185 "name": "raid_bdev1", 00:19:42.185 "uuid": "fe170700-baa0-4a05-9465-8885914c5f04", 00:19:42.185 "strip_size_kb": 0, 00:19:42.185 "state": "online", 00:19:42.185 "raid_level": "raid1", 00:19:42.185 "superblock": true, 00:19:42.185 "num_base_bdevs": 2, 00:19:42.185 "num_base_bdevs_discovered": 1, 00:19:42.185 "num_base_bdevs_operational": 1, 00:19:42.185 "base_bdevs_list": [ 00:19:42.185 { 00:19:42.185 "name": null, 00:19:42.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:42.185 "is_configured": false, 00:19:42.185 "data_offset": 2048, 00:19:42.185 "data_size": 63488 00:19:42.185 }, 00:19:42.185 { 00:19:42.185 "name": "BaseBdev2", 00:19:42.185 "uuid": "2be68b8a-8450-59c3-9985-cb04f3b8a099", 00:19:42.185 "is_configured": true, 00:19:42.185 "data_offset": 2048, 00:19:42.185 "data_size": 63488 00:19:42.185 } 00:19:42.185 ] 00:19:42.185 }' 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:42.185 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:42.445 12:00:32 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 703638 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 703638 ']' 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 703638 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 703638 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 703638' 00:19:42.445 killing process with pid 703638 00:19:42.445 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 703638 00:19:42.445 Received shutdown signal, test time was about 21.301109 seconds 00:19:42.445 00:19:42.445 Latency(us) 00:19:42.445 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:42.446 =================================================================================================================== 00:19:42.446 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:19:42.446 [2024-07-12 12:00:32.494441] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:42.446 [2024-07-12 12:00:32.494508] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:42.446 [2024-07-12 12:00:32.494545] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base 
bdevs is 0, going to free all in destruct 00:19:42.446 [2024-07-12 12:00:32.494551] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa69700 name raid_bdev1, state offline 00:19:42.446 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 703638 00:19:42.446 [2024-07-12 12:00:32.513118] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:19:42.706 00:19:42.706 real 0m24.810s 00:19:42.706 user 0m38.347s 00:19:42.706 sys 0m2.919s 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:42.706 ************************************ 00:19:42.706 END TEST raid_rebuild_test_sb_io 00:19:42.706 ************************************ 00:19:42.706 12:00:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:42.706 12:00:32 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:19:42.706 12:00:32 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:19:42.706 12:00:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:19:42.706 12:00:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:42.706 12:00:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:42.706 ************************************ 00:19:42.706 START TEST raid_rebuild_test 00:19:42.706 ************************************ 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:19:42.706 12:00:32 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 
00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=708572 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 708572 /var/tmp/spdk-raid.sock 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 708572 ']' 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:42.706 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:42.706 12:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.706 [2024-07-12 12:00:32.802943] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:19:42.706 [2024-07-12 12:00:32.802976] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid708572 ] 00:19:42.706 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:42.706 Zero copy mechanism will not be used. 00:19:42.706 [2024-07-12 12:00:32.866296] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:42.707 [2024-07-12 12:00:32.944504] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:19:42.964 [2024-07-12 12:00:33.001138] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:42.964 [2024-07-12 12:00:33.001163] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:43.543 12:00:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:43.543 12:00:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:19:43.543 12:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:43.543 12:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:43.543 BaseBdev1_malloc 00:19:43.543 12:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:43.801 [2024-07-12 12:00:33.916781] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:43.801 [2024-07-12 12:00:33.916812] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:43.801 [2024-07-12 12:00:33.916825] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d2e010 00:19:43.801 [2024-07-12 12:00:33.916831] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:43.801 [2024-07-12 12:00:33.917975] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:43.801 [2024-07-12 12:00:33.917995] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:43.801 BaseBdev1 00:19:43.801 12:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:43.801 12:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:44.059 BaseBdev2_malloc 00:19:44.059 12:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:44.059 [2024-07-12 12:00:34.237260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:44.059 [2024-07-12 12:00:34.237288] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:44.059 [2024-07-12 12:00:34.237300] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d2eb60 00:19:44.059 [2024-07-12 12:00:34.237306] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:44.059 [2024-07-12 12:00:34.238286] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:44.059 [2024-07-12 12:00:34.238306] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev 
for: BaseBdev2 00:19:44.059 BaseBdev2 00:19:44.059 12:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:44.059 12:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:44.318 BaseBdev3_malloc 00:19:44.318 12:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:19:44.318 [2024-07-12 12:00:34.557505] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:19:44.318 [2024-07-12 12:00:34.557537] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:44.318 [2024-07-12 12:00:34.557548] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1edb0a0 00:19:44.318 [2024-07-12 12:00:34.557554] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:44.318 [2024-07-12 12:00:34.558592] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:44.318 [2024-07-12 12:00:34.558613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:44.318 BaseBdev3 00:19:44.577 12:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:44.577 12:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:44.577 BaseBdev4_malloc 00:19:44.577 12:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:19:44.835 [2024-07-12 12:00:34.897916] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:19:44.835 [2024-07-12 12:00:34.897948] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:44.835 [2024-07-12 12:00:34.897960] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ed9880 00:19:44.835 [2024-07-12 12:00:34.897966] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:44.835 [2024-07-12 12:00:34.899029] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:44.835 [2024-07-12 12:00:34.899049] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:44.835 BaseBdev4 00:19:44.835 12:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:44.835 spare_malloc 00:19:45.092 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:45.092 spare_delay 00:19:45.092 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:45.349 [2024-07-12 12:00:35.394531] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:45.349 [2024-07-12 12:00:35.394562] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:45.349 [2024-07-12 12:00:35.394574] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1edd400 00:19:45.349 [2024-07-12 12:00:35.394580] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:45.349 [2024-07-12 12:00:35.395603] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:45.349 [2024-07-12 12:00:35.395623] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:45.349 spare 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:19:45.349 [2024-07-12 12:00:35.558976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:45.349 [2024-07-12 12:00:35.559870] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:45.349 [2024-07-12 12:00:35.559907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:45.349 [2024-07-12 12:00:35.559935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:45.349 [2024-07-12 12:00:35.559986] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ed7dd0 00:19:45.349 [2024-07-12 12:00:35.559995] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:45.349 [2024-07-12 12:00:35.560142] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e62890 00:19:45.349 [2024-07-12 12:00:35.560244] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ed7dd0 00:19:45.349 [2024-07-12 12:00:35.560249] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ed7dd0 00:19:45.349 [2024-07-12 12:00:35.560323] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:45.349 12:00:35 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.349 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:45.607 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:45.607 "name": "raid_bdev1", 00:19:45.607 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:45.607 "strip_size_kb": 0, 00:19:45.607 "state": "online", 00:19:45.607 "raid_level": "raid1", 00:19:45.607 "superblock": false, 00:19:45.607 "num_base_bdevs": 4, 00:19:45.607 "num_base_bdevs_discovered": 4, 00:19:45.607 "num_base_bdevs_operational": 4, 00:19:45.607 "base_bdevs_list": [ 00:19:45.607 { 00:19:45.607 "name": "BaseBdev1", 00:19:45.607 "uuid": "1e21ac59-6527-5ed1-8994-8cf460cef9f9", 00:19:45.607 "is_configured": true, 00:19:45.607 "data_offset": 0, 00:19:45.607 "data_size": 65536 00:19:45.607 }, 00:19:45.607 { 00:19:45.607 "name": "BaseBdev2", 00:19:45.607 "uuid": "e6c45c21-75c0-56e3-8824-4b21f10c24fb", 00:19:45.607 "is_configured": true, 00:19:45.607 
"data_offset": 0, 00:19:45.607 "data_size": 65536 00:19:45.607 }, 00:19:45.607 { 00:19:45.607 "name": "BaseBdev3", 00:19:45.607 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:45.607 "is_configured": true, 00:19:45.607 "data_offset": 0, 00:19:45.607 "data_size": 65536 00:19:45.607 }, 00:19:45.607 { 00:19:45.607 "name": "BaseBdev4", 00:19:45.607 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:45.607 "is_configured": true, 00:19:45.607 "data_offset": 0, 00:19:45.607 "data_size": 65536 00:19:45.607 } 00:19:45.607 ] 00:19:45.607 }' 00:19:45.607 12:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:45.607 12:00:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:46.175 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:46.175 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:46.175 [2024-07-12 12:00:36.373249] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:46.175 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:19:46.175 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.175 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:19:46.436 12:00:36 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:46.436 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:46.720 [2024-07-12 12:00:36.709964] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e64fc0 00:19:46.720 /dev/nbd0 00:19:46.720 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:46.720 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:46.720 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:19:46.720 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:19:46.720 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:19:46.720 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:19:46.720 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:19:46.720 
12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:46.721 1+0 records in 00:19:46.721 1+0 records out 00:19:46.721 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225241 s, 18.2 MB/s 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:19:46.721 12:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:19:52.025 65536+0 records in 00:19:52.025 65536+0 records out 00:19:52.025 33554432 bytes (34 MB, 32 MiB) copied, 4.4804 s, 7.5 MB/s 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # 
nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:52.025 [2024-07-12 12:00:41.426012] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:52.025 [2024-07-12 12:00:41.581682] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:52.025 12:00:41 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.025 "name": "raid_bdev1", 00:19:52.025 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:52.025 "strip_size_kb": 0, 00:19:52.025 "state": "online", 00:19:52.025 "raid_level": "raid1", 00:19:52.025 "superblock": false, 00:19:52.025 "num_base_bdevs": 4, 00:19:52.025 "num_base_bdevs_discovered": 3, 00:19:52.025 "num_base_bdevs_operational": 3, 00:19:52.025 "base_bdevs_list": [ 00:19:52.025 { 00:19:52.025 "name": null, 00:19:52.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:52.025 "is_configured": false, 
00:19:52.025 "data_offset": 0, 00:19:52.025 "data_size": 65536 00:19:52.025 }, 00:19:52.025 { 00:19:52.025 "name": "BaseBdev2", 00:19:52.025 "uuid": "e6c45c21-75c0-56e3-8824-4b21f10c24fb", 00:19:52.025 "is_configured": true, 00:19:52.025 "data_offset": 0, 00:19:52.025 "data_size": 65536 00:19:52.025 }, 00:19:52.025 { 00:19:52.025 "name": "BaseBdev3", 00:19:52.025 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:52.025 "is_configured": true, 00:19:52.025 "data_offset": 0, 00:19:52.025 "data_size": 65536 00:19:52.025 }, 00:19:52.025 { 00:19:52.025 "name": "BaseBdev4", 00:19:52.025 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:52.025 "is_configured": true, 00:19:52.025 "data_offset": 0, 00:19:52.025 "data_size": 65536 00:19:52.025 } 00:19:52.025 ] 00:19:52.025 }' 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.025 12:00:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.284 12:00:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:52.284 [2024-07-12 12:00:42.443919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:52.284 [2024-07-12 12:00:42.447378] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e64900 00:19:52.284 [2024-07-12 12:00:42.448794] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:52.284 12:00:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:53.661 "name": "raid_bdev1", 00:19:53.661 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:53.661 "strip_size_kb": 0, 00:19:53.661 "state": "online", 00:19:53.661 "raid_level": "raid1", 00:19:53.661 "superblock": false, 00:19:53.661 "num_base_bdevs": 4, 00:19:53.661 "num_base_bdevs_discovered": 4, 00:19:53.661 "num_base_bdevs_operational": 4, 00:19:53.661 "process": { 00:19:53.661 "type": "rebuild", 00:19:53.661 "target": "spare", 00:19:53.661 "progress": { 00:19:53.661 "blocks": 22528, 00:19:53.661 "percent": 34 00:19:53.661 } 00:19:53.661 }, 00:19:53.661 "base_bdevs_list": [ 00:19:53.661 { 00:19:53.661 "name": "spare", 00:19:53.661 "uuid": "661fa2d8-81e7-5426-a6e3-35b4e7c36d9d", 00:19:53.661 "is_configured": true, 00:19:53.661 "data_offset": 0, 00:19:53.661 "data_size": 65536 00:19:53.661 }, 00:19:53.661 { 00:19:53.661 "name": "BaseBdev2", 00:19:53.661 "uuid": "e6c45c21-75c0-56e3-8824-4b21f10c24fb", 00:19:53.661 "is_configured": true, 00:19:53.661 "data_offset": 0, 00:19:53.661 "data_size": 65536 00:19:53.661 }, 00:19:53.661 { 00:19:53.661 "name": "BaseBdev3", 00:19:53.661 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:53.661 "is_configured": true, 00:19:53.661 "data_offset": 0, 00:19:53.661 "data_size": 65536 00:19:53.661 }, 00:19:53.661 { 00:19:53.661 "name": "BaseBdev4", 00:19:53.661 "uuid": 
"58919239-0589-5be7-95ea-b79a31f6982e", 00:19:53.661 "is_configured": true, 00:19:53.661 "data_offset": 0, 00:19:53.661 "data_size": 65536 00:19:53.661 } 00:19:53.661 ] 00:19:53.661 }' 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:53.661 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:53.661 [2024-07-12 12:00:43.852915] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:53.661 [2024-07-12 12:00:43.858548] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:53.661 [2024-07-12 12:00:43.858573] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:53.661 [2024-07-12 12:00:43.858583] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:53.662 [2024-07-12 12:00:43.858603] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.662 12:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.919 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.919 "name": "raid_bdev1", 00:19:53.919 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:53.919 "strip_size_kb": 0, 00:19:53.919 "state": "online", 00:19:53.919 "raid_level": "raid1", 00:19:53.919 "superblock": false, 00:19:53.919 "num_base_bdevs": 4, 00:19:53.919 "num_base_bdevs_discovered": 3, 00:19:53.919 "num_base_bdevs_operational": 3, 00:19:53.919 "base_bdevs_list": [ 00:19:53.919 { 00:19:53.919 "name": null, 00:19:53.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.919 "is_configured": false, 00:19:53.919 "data_offset": 0, 00:19:53.919 "data_size": 65536 00:19:53.919 }, 00:19:53.919 { 00:19:53.919 "name": "BaseBdev2", 00:19:53.919 "uuid": "e6c45c21-75c0-56e3-8824-4b21f10c24fb", 00:19:53.919 "is_configured": true, 00:19:53.919 "data_offset": 0, 00:19:53.919 "data_size": 65536 00:19:53.919 }, 00:19:53.919 { 00:19:53.919 "name": "BaseBdev3", 00:19:53.919 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:53.919 "is_configured": true, 00:19:53.919 "data_offset": 0, 00:19:53.919 "data_size": 65536 
00:19:53.919 }, 00:19:53.919 { 00:19:53.919 "name": "BaseBdev4", 00:19:53.919 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:53.919 "is_configured": true, 00:19:53.919 "data_offset": 0, 00:19:53.919 "data_size": 65536 00:19:53.919 } 00:19:53.919 ] 00:19:53.919 }' 00:19:53.919 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.919 12:00:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:54.484 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:54.484 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:54.484 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:54.484 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:54.484 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:54.484 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:54.484 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.484 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:54.484 "name": "raid_bdev1", 00:19:54.484 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:54.484 "strip_size_kb": 0, 00:19:54.484 "state": "online", 00:19:54.484 "raid_level": "raid1", 00:19:54.484 "superblock": false, 00:19:54.484 "num_base_bdevs": 4, 00:19:54.484 "num_base_bdevs_discovered": 3, 00:19:54.484 "num_base_bdevs_operational": 3, 00:19:54.484 "base_bdevs_list": [ 00:19:54.484 { 00:19:54.484 "name": null, 00:19:54.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:54.484 "is_configured": false, 00:19:54.484 "data_offset": 0, 00:19:54.484 
"data_size": 65536 00:19:54.484 }, 00:19:54.484 { 00:19:54.484 "name": "BaseBdev2", 00:19:54.484 "uuid": "e6c45c21-75c0-56e3-8824-4b21f10c24fb", 00:19:54.484 "is_configured": true, 00:19:54.484 "data_offset": 0, 00:19:54.484 "data_size": 65536 00:19:54.484 }, 00:19:54.484 { 00:19:54.484 "name": "BaseBdev3", 00:19:54.484 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:54.484 "is_configured": true, 00:19:54.484 "data_offset": 0, 00:19:54.484 "data_size": 65536 00:19:54.484 }, 00:19:54.484 { 00:19:54.484 "name": "BaseBdev4", 00:19:54.484 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:54.484 "is_configured": true, 00:19:54.484 "data_offset": 0, 00:19:54.484 "data_size": 65536 00:19:54.484 } 00:19:54.484 ] 00:19:54.484 }' 00:19:54.484 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:54.742 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:54.742 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:54.742 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:54.742 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:54.742 [2024-07-12 12:00:44.961074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:54.742 [2024-07-12 12:00:44.964595] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1edd820 00:19:54.742 [2024-07-12 12:00:44.965654] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:54.742 12:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:56.123 12:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:56.123 
12:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:56.123 12:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:56.123 12:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:56.123 12:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:56.123 12:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.123 12:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:56.123 "name": "raid_bdev1", 00:19:56.123 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:56.123 "strip_size_kb": 0, 00:19:56.123 "state": "online", 00:19:56.123 "raid_level": "raid1", 00:19:56.123 "superblock": false, 00:19:56.123 "num_base_bdevs": 4, 00:19:56.123 "num_base_bdevs_discovered": 4, 00:19:56.123 "num_base_bdevs_operational": 4, 00:19:56.123 "process": { 00:19:56.123 "type": "rebuild", 00:19:56.123 "target": "spare", 00:19:56.123 "progress": { 00:19:56.123 "blocks": 22528, 00:19:56.123 "percent": 34 00:19:56.123 } 00:19:56.123 }, 00:19:56.123 "base_bdevs_list": [ 00:19:56.123 { 00:19:56.123 "name": "spare", 00:19:56.123 "uuid": "661fa2d8-81e7-5426-a6e3-35b4e7c36d9d", 00:19:56.123 "is_configured": true, 00:19:56.123 "data_offset": 0, 00:19:56.123 "data_size": 65536 00:19:56.123 }, 00:19:56.123 { 00:19:56.123 "name": "BaseBdev2", 00:19:56.123 "uuid": "e6c45c21-75c0-56e3-8824-4b21f10c24fb", 00:19:56.123 "is_configured": true, 00:19:56.123 "data_offset": 0, 00:19:56.123 "data_size": 65536 00:19:56.123 }, 00:19:56.123 { 00:19:56.123 "name": "BaseBdev3", 00:19:56.123 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:56.123 
"is_configured": true, 00:19:56.123 "data_offset": 0, 00:19:56.123 "data_size": 65536 00:19:56.123 }, 00:19:56.123 { 00:19:56.123 "name": "BaseBdev4", 00:19:56.123 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:56.123 "is_configured": true, 00:19:56.123 "data_offset": 0, 00:19:56.123 "data_size": 65536 00:19:56.123 } 00:19:56.123 ] 00:19:56.123 }' 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:19:56.123 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:56.381 [2024-07-12 12:00:46.385849] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:56.381 [2024-07-12 12:00:46.476119] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1edd820 00:19:56.381 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:19:56.381 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:19:56.381 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:19:56.381 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:56.381 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:56.381 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:56.381 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:56.381 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.381 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:56.640 "name": "raid_bdev1", 00:19:56.640 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:56.640 "strip_size_kb": 0, 00:19:56.640 "state": "online", 00:19:56.640 "raid_level": "raid1", 00:19:56.640 "superblock": false, 00:19:56.640 "num_base_bdevs": 4, 00:19:56.640 "num_base_bdevs_discovered": 3, 00:19:56.640 "num_base_bdevs_operational": 3, 00:19:56.640 "process": { 00:19:56.640 "type": "rebuild", 00:19:56.640 "target": "spare", 00:19:56.640 "progress": { 00:19:56.640 "blocks": 32768, 00:19:56.640 "percent": 50 00:19:56.640 } 00:19:56.640 }, 00:19:56.640 "base_bdevs_list": [ 00:19:56.640 { 00:19:56.640 "name": "spare", 00:19:56.640 "uuid": "661fa2d8-81e7-5426-a6e3-35b4e7c36d9d", 00:19:56.640 "is_configured": true, 00:19:56.640 "data_offset": 0, 00:19:56.640 "data_size": 65536 00:19:56.640 }, 00:19:56.640 { 00:19:56.640 "name": null, 00:19:56.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.640 "is_configured": false, 00:19:56.640 "data_offset": 0, 00:19:56.640 "data_size": 65536 00:19:56.640 }, 00:19:56.640 { 00:19:56.640 "name": "BaseBdev3", 00:19:56.640 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:56.640 
"is_configured": true, 00:19:56.640 "data_offset": 0, 00:19:56.640 "data_size": 65536 00:19:56.640 }, 00:19:56.640 { 00:19:56.640 "name": "BaseBdev4", 00:19:56.640 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:56.640 "is_configured": true, 00:19:56.640 "data_offset": 0, 00:19:56.640 "data_size": 65536 00:19:56.640 } 00:19:56.640 ] 00:19:56.640 }' 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=666 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.640 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.899 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:56.899 "name": 
"raid_bdev1", 00:19:56.899 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:56.899 "strip_size_kb": 0, 00:19:56.899 "state": "online", 00:19:56.899 "raid_level": "raid1", 00:19:56.899 "superblock": false, 00:19:56.899 "num_base_bdevs": 4, 00:19:56.899 "num_base_bdevs_discovered": 3, 00:19:56.899 "num_base_bdevs_operational": 3, 00:19:56.899 "process": { 00:19:56.899 "type": "rebuild", 00:19:56.899 "target": "spare", 00:19:56.899 "progress": { 00:19:56.899 "blocks": 38912, 00:19:56.899 "percent": 59 00:19:56.899 } 00:19:56.899 }, 00:19:56.899 "base_bdevs_list": [ 00:19:56.899 { 00:19:56.899 "name": "spare", 00:19:56.899 "uuid": "661fa2d8-81e7-5426-a6e3-35b4e7c36d9d", 00:19:56.899 "is_configured": true, 00:19:56.899 "data_offset": 0, 00:19:56.899 "data_size": 65536 00:19:56.899 }, 00:19:56.899 { 00:19:56.899 "name": null, 00:19:56.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.899 "is_configured": false, 00:19:56.899 "data_offset": 0, 00:19:56.899 "data_size": 65536 00:19:56.899 }, 00:19:56.899 { 00:19:56.899 "name": "BaseBdev3", 00:19:56.899 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:56.899 "is_configured": true, 00:19:56.899 "data_offset": 0, 00:19:56.899 "data_size": 65536 00:19:56.899 }, 00:19:56.899 { 00:19:56.899 "name": "BaseBdev4", 00:19:56.899 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:56.899 "is_configured": true, 00:19:56.899 "data_offset": 0, 00:19:56.899 "data_size": 65536 00:19:56.899 } 00:19:56.899 ] 00:19:56.899 }' 00:19:56.899 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:56.899 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:56.899 12:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:56.899 12:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:56.899 12:00:47 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:19:57.835 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:57.835 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:57.835 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:57.835 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:57.835 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:57.835 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:57.835 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.835 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:58.093 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:58.094 "name": "raid_bdev1", 00:19:58.094 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:58.094 "strip_size_kb": 0, 00:19:58.094 "state": "online", 00:19:58.094 "raid_level": "raid1", 00:19:58.094 "superblock": false, 00:19:58.094 "num_base_bdevs": 4, 00:19:58.094 "num_base_bdevs_discovered": 3, 00:19:58.094 "num_base_bdevs_operational": 3, 00:19:58.094 "process": { 00:19:58.094 "type": "rebuild", 00:19:58.094 "target": "spare", 00:19:58.094 "progress": { 00:19:58.094 "blocks": 63488, 00:19:58.094 "percent": 96 00:19:58.094 } 00:19:58.094 }, 00:19:58.094 "base_bdevs_list": [ 00:19:58.094 { 00:19:58.094 "name": "spare", 00:19:58.094 "uuid": "661fa2d8-81e7-5426-a6e3-35b4e7c36d9d", 00:19:58.094 "is_configured": true, 00:19:58.094 "data_offset": 0, 00:19:58.094 "data_size": 65536 00:19:58.094 }, 00:19:58.094 { 00:19:58.094 "name": null, 00:19:58.094 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:58.094 "is_configured": false, 00:19:58.094 "data_offset": 0, 00:19:58.094 "data_size": 65536 00:19:58.094 }, 00:19:58.094 { 00:19:58.094 "name": "BaseBdev3", 00:19:58.094 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:58.094 "is_configured": true, 00:19:58.094 "data_offset": 0, 00:19:58.094 "data_size": 65536 00:19:58.094 }, 00:19:58.094 { 00:19:58.094 "name": "BaseBdev4", 00:19:58.094 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:58.094 "is_configured": true, 00:19:58.094 "data_offset": 0, 00:19:58.094 "data_size": 65536 00:19:58.094 } 00:19:58.094 ] 00:19:58.094 }' 00:19:58.094 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:58.094 [2024-07-12 12:00:48.188035] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:58.094 [2024-07-12 12:00:48.188073] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:58.094 [2024-07-12 12:00:48.188094] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:58.094 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:58.094 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:58.094 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:58.094 12:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:59.030 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:59.030 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:59.030 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:59.030 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:19:59.030 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:59.030 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:59.030 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.030 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:59.289 "name": "raid_bdev1", 00:19:59.289 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:59.289 "strip_size_kb": 0, 00:19:59.289 "state": "online", 00:19:59.289 "raid_level": "raid1", 00:19:59.289 "superblock": false, 00:19:59.289 "num_base_bdevs": 4, 00:19:59.289 "num_base_bdevs_discovered": 3, 00:19:59.289 "num_base_bdevs_operational": 3, 00:19:59.289 "base_bdevs_list": [ 00:19:59.289 { 00:19:59.289 "name": "spare", 00:19:59.289 "uuid": "661fa2d8-81e7-5426-a6e3-35b4e7c36d9d", 00:19:59.289 "is_configured": true, 00:19:59.289 "data_offset": 0, 00:19:59.289 "data_size": 65536 00:19:59.289 }, 00:19:59.289 { 00:19:59.289 "name": null, 00:19:59.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.289 "is_configured": false, 00:19:59.289 "data_offset": 0, 00:19:59.289 "data_size": 65536 00:19:59.289 }, 00:19:59.289 { 00:19:59.289 "name": "BaseBdev3", 00:19:59.289 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:59.289 "is_configured": true, 00:19:59.289 "data_offset": 0, 00:19:59.289 "data_size": 65536 00:19:59.289 }, 00:19:59.289 { 00:19:59.289 "name": "BaseBdev4", 00:19:59.289 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:59.289 "is_configured": true, 00:19:59.289 "data_offset": 0, 00:19:59.289 "data_size": 65536 00:19:59.289 } 00:19:59.289 ] 00:19:59.289 }' 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.289 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.547 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:59.547 "name": "raid_bdev1", 00:19:59.547 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:59.547 "strip_size_kb": 0, 00:19:59.547 "state": "online", 00:19:59.547 "raid_level": "raid1", 00:19:59.547 "superblock": false, 00:19:59.547 "num_base_bdevs": 4, 00:19:59.547 "num_base_bdevs_discovered": 3, 00:19:59.547 "num_base_bdevs_operational": 3, 00:19:59.547 "base_bdevs_list": [ 00:19:59.547 { 00:19:59.547 "name": "spare", 00:19:59.547 "uuid": "661fa2d8-81e7-5426-a6e3-35b4e7c36d9d", 00:19:59.547 "is_configured": true, 00:19:59.547 "data_offset": 0, 
00:19:59.547 "data_size": 65536 00:19:59.547 }, 00:19:59.547 { 00:19:59.547 "name": null, 00:19:59.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.547 "is_configured": false, 00:19:59.547 "data_offset": 0, 00:19:59.547 "data_size": 65536 00:19:59.547 }, 00:19:59.547 { 00:19:59.547 "name": "BaseBdev3", 00:19:59.547 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:59.547 "is_configured": true, 00:19:59.547 "data_offset": 0, 00:19:59.547 "data_size": 65536 00:19:59.547 }, 00:19:59.547 { 00:19:59.548 "name": "BaseBdev4", 00:19:59.548 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:59.548 "is_configured": true, 00:19:59.548 "data_offset": 0, 00:19:59.548 "data_size": 65536 00:19:59.548 } 00:19:59.548 ] 00:19:59.548 }' 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.548 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.806 12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.806 "name": "raid_bdev1", 00:19:59.806 "uuid": "62becb9a-cd3e-469e-9bed-ac8d15c0003e", 00:19:59.806 "strip_size_kb": 0, 00:19:59.806 "state": "online", 00:19:59.806 "raid_level": "raid1", 00:19:59.806 "superblock": false, 00:19:59.806 "num_base_bdevs": 4, 00:19:59.806 "num_base_bdevs_discovered": 3, 00:19:59.806 "num_base_bdevs_operational": 3, 00:19:59.806 "base_bdevs_list": [ 00:19:59.806 { 00:19:59.806 "name": "spare", 00:19:59.806 "uuid": "661fa2d8-81e7-5426-a6e3-35b4e7c36d9d", 00:19:59.806 "is_configured": true, 00:19:59.806 "data_offset": 0, 00:19:59.806 "data_size": 65536 00:19:59.806 }, 00:19:59.806 { 00:19:59.806 "name": null, 00:19:59.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.806 "is_configured": false, 00:19:59.806 "data_offset": 0, 00:19:59.806 "data_size": 65536 00:19:59.806 }, 00:19:59.806 { 00:19:59.806 "name": "BaseBdev3", 00:19:59.806 "uuid": "a9661bbb-cca1-5c43-8342-4c904ff738ee", 00:19:59.806 "is_configured": true, 00:19:59.806 "data_offset": 0, 00:19:59.806 "data_size": 65536 00:19:59.806 }, 00:19:59.806 { 00:19:59.806 "name": "BaseBdev4", 00:19:59.806 "uuid": "58919239-0589-5be7-95ea-b79a31f6982e", 00:19:59.806 "is_configured": true, 00:19:59.806 "data_offset": 0, 00:19:59.806 "data_size": 65536 00:19:59.806 } 00:19:59.806 ] 00:19:59.806 }' 00:19:59.806 
12:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.806 12:00:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.373 12:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:00.373 [2024-07-12 12:00:50.585775] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:00.373 [2024-07-12 12:00:50.585796] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:00.373 [2024-07-12 12:00:50.585836] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:00.373 [2024-07-12 12:00:50.585888] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:00.373 [2024-07-12 12:00:50.585894] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ed7dd0 name raid_bdev1, state offline 00:20:00.374 12:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.374 12:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # 
bdev_list=('BaseBdev1' 'spare') 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:00.646 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:00.915 /dev/nbd0 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:00.915 1+0 records in 00:20:00.915 1+0 records out 00:20:00.915 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000196726 s, 20.8 MB/s 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:00.915 12:00:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:00.915 /dev/nbd1 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:00.915 
12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:00.915 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:01.175 1+0 records in 00:20:01.175 1+0 records out 00:20:01.175 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000171942 s, 23.8 MB/s 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local 
nbd_list 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:01.175 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:01.434 12:00:51 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 708572 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 708572 ']' 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 708572 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 708572 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 708572' 00:20:01.434 killing process with pid 708572 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 708572 00:20:01.434 Received shutdown signal, test time was about 60.000000 seconds 00:20:01.434 00:20:01.434 Latency(us) 00:20:01.434 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:01.434 =================================================================================================================== 00:20:01.434 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:01.434 [2024-07-12 12:00:51.641483] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:20:01.434 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 708572 00:20:01.693 [2024-07-12 12:00:51.681147] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:20:01.693 00:20:01.693 real 0m19.106s 00:20:01.693 user 0m26.181s 00:20:01.693 sys 0m3.123s 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:01.693 ************************************ 00:20:01.693 END TEST raid_rebuild_test 00:20:01.693 ************************************ 00:20:01.693 12:00:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:01.693 12:00:51 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:20:01.693 12:00:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:01.693 12:00:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:01.693 12:00:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:01.693 ************************************ 00:20:01.693 START TEST raid_rebuild_test_sb 00:20:01.693 ************************************ 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local 
verify=true 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:01.693 12:00:51 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:01.693 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=711998 00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 711998 /var/tmp/spdk-raid.sock 00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 711998 ']' 00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:01.694 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:01.694 12:00:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:01.952 [2024-07-12 12:00:51.981477] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:20:01.952 [2024-07-12 12:00:51.981515] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid711998 ] 00:20:01.952 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:01.952 Zero copy mechanism will not be used. 00:20:01.953 [2024-07-12 12:00:52.044820] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:01.953 [2024-07-12 12:00:52.122300] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:20:01.953 [2024-07-12 12:00:52.176270] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:01.953 [2024-07-12 12:00:52.176296] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:02.886 12:00:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:02.886 12:00:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:02.886 12:00:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:02.886 12:00:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:02.886 BaseBdev1_malloc 00:20:02.886 12:00:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:02.886 [2024-07-12 
12:00:53.083921] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:02.886 [2024-07-12 12:00:53.083954] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.886 [2024-07-12 12:00:53.083966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x256e010 00:20:02.886 [2024-07-12 12:00:53.083972] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.886 [2024-07-12 12:00:53.085260] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.886 [2024-07-12 12:00:53.085281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:02.886 BaseBdev1 00:20:02.886 12:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:02.886 12:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:03.143 BaseBdev2_malloc 00:20:03.143 12:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:03.401 [2024-07-12 12:00:53.408259] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:03.401 [2024-07-12 12:00:53.408290] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:03.401 [2024-07-12 12:00:53.408303] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x256eb60 00:20:03.401 [2024-07-12 12:00:53.408324] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:03.401 [2024-07-12 12:00:53.409420] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:03.401 [2024-07-12 12:00:53.409439] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:03.401 BaseBdev2 00:20:03.401 12:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:03.401 12:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:03.401 BaseBdev3_malloc 00:20:03.401 12:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:03.660 [2024-07-12 12:00:53.732412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:03.660 [2024-07-12 12:00:53.732443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:03.660 [2024-07-12 12:00:53.732454] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x271b0a0 00:20:03.660 [2024-07-12 12:00:53.732460] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:03.660 [2024-07-12 12:00:53.733590] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:03.660 [2024-07-12 12:00:53.733610] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:03.660 BaseBdev3 00:20:03.660 12:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:03.660 12:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:03.660 BaseBdev4_malloc 00:20:03.660 12:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
BaseBdev4_malloc -p BaseBdev4 00:20:03.919 [2024-07-12 12:00:54.044721] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:03.919 [2024-07-12 12:00:54.044752] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:03.919 [2024-07-12 12:00:54.044763] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2719880 00:20:03.919 [2024-07-12 12:00:54.044769] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:03.919 [2024-07-12 12:00:54.045850] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:03.919 [2024-07-12 12:00:54.045869] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:03.919 BaseBdev4 00:20:03.919 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:04.177 spare_malloc 00:20:04.177 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:04.177 spare_delay 00:20:04.177 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:04.436 [2024-07-12 12:00:54.537562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:04.436 [2024-07-12 12:00:54.537590] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:04.436 [2024-07-12 12:00:54.537603] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x271d400 00:20:04.436 [2024-07-12 12:00:54.537608] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:20:04.436 [2024-07-12 12:00:54.538650] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:04.436 [2024-07-12 12:00:54.538668] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:04.436 spare 00:20:04.436 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:04.694 [2024-07-12 12:00:54.689977] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:04.694 [2024-07-12 12:00:54.690778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:04.694 [2024-07-12 12:00:54.690813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:04.694 [2024-07-12 12:00:54.690840] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:04.694 [2024-07-12 12:00:54.690959] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2717dd0 00:20:04.694 [2024-07-12 12:00:54.690965] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:04.694 [2024-07-12 12:00:54.691088] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x256d460 00:20:04.694 [2024-07-12 12:00:54.691182] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2717dd0 00:20:04.694 [2024-07-12 12:00:54.691187] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2717dd0 00:20:04.694 [2024-07-12 12:00:54.691246] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.694 "name": "raid_bdev1", 00:20:04.694 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:04.694 "strip_size_kb": 0, 00:20:04.694 "state": "online", 00:20:04.694 "raid_level": "raid1", 00:20:04.694 "superblock": true, 00:20:04.694 "num_base_bdevs": 4, 00:20:04.694 "num_base_bdevs_discovered": 4, 00:20:04.694 "num_base_bdevs_operational": 4, 00:20:04.694 "base_bdevs_list": [ 00:20:04.694 { 00:20:04.694 "name": "BaseBdev1", 00:20:04.694 "uuid": "49c3cd78-a8dc-5cce-aa8e-b15030571083", 00:20:04.694 "is_configured": true, 00:20:04.694 "data_offset": 2048, 00:20:04.694 "data_size": 63488 00:20:04.694 }, 00:20:04.694 { 00:20:04.694 "name": "BaseBdev2", 00:20:04.694 
"uuid": "cf2c7219-1827-5692-b8dd-e9b853059b23", 00:20:04.694 "is_configured": true, 00:20:04.694 "data_offset": 2048, 00:20:04.694 "data_size": 63488 00:20:04.694 }, 00:20:04.694 { 00:20:04.694 "name": "BaseBdev3", 00:20:04.694 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:04.694 "is_configured": true, 00:20:04.694 "data_offset": 2048, 00:20:04.694 "data_size": 63488 00:20:04.694 }, 00:20:04.694 { 00:20:04.694 "name": "BaseBdev4", 00:20:04.694 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:04.694 "is_configured": true, 00:20:04.694 "data_offset": 2048, 00:20:04.694 "data_size": 63488 00:20:04.694 } 00:20:04.694 ] 00:20:04.694 }' 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.694 12:00:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:05.260 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:05.260 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:05.260 [2024-07-12 12:00:55.500244] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true 
= true ']' 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:05.519 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:05.777 [2024-07-12 12:00:55.844969] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x256ca20 00:20:05.777 /dev/nbd0 00:20:05.777 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:05.777 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:05.777 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:05.777 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:05.778 1+0 records in 00:20:05.778 1+0 records out 00:20:05.778 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000209462 s, 19.6 MB/s 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:20:05.778 12:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 
00:20:11.056 63488+0 records in 00:20:11.056 63488+0 records out 00:20:11.056 32505856 bytes (33 MB, 31 MiB) copied, 4.36008 s, 7.5 MB/s 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:11.056 [2024-07-12 12:01:00.449006] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:11.056 [2024-07-12 12:01:00.605434] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:11.056 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:11.056 "name": "raid_bdev1", 00:20:11.056 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:11.056 "strip_size_kb": 0, 00:20:11.056 "state": "online", 00:20:11.056 "raid_level": "raid1", 00:20:11.056 "superblock": true, 
00:20:11.056 "num_base_bdevs": 4, 00:20:11.056 "num_base_bdevs_discovered": 3, 00:20:11.056 "num_base_bdevs_operational": 3, 00:20:11.056 "base_bdevs_list": [ 00:20:11.056 { 00:20:11.056 "name": null, 00:20:11.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.056 "is_configured": false, 00:20:11.056 "data_offset": 2048, 00:20:11.056 "data_size": 63488 00:20:11.056 }, 00:20:11.056 { 00:20:11.056 "name": "BaseBdev2", 00:20:11.056 "uuid": "cf2c7219-1827-5692-b8dd-e9b853059b23", 00:20:11.056 "is_configured": true, 00:20:11.056 "data_offset": 2048, 00:20:11.056 "data_size": 63488 00:20:11.056 }, 00:20:11.056 { 00:20:11.056 "name": "BaseBdev3", 00:20:11.056 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:11.056 "is_configured": true, 00:20:11.056 "data_offset": 2048, 00:20:11.057 "data_size": 63488 00:20:11.057 }, 00:20:11.057 { 00:20:11.057 "name": "BaseBdev4", 00:20:11.057 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:11.057 "is_configured": true, 00:20:11.057 "data_offset": 2048, 00:20:11.057 "data_size": 63488 00:20:11.057 } 00:20:11.057 ] 00:20:11.057 }' 00:20:11.057 12:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:11.057 12:01:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:11.057 12:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:11.315 [2024-07-12 12:01:01.435593] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:11.315 [2024-07-12 12:01:01.439094] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x269eca0 00:20:11.315 [2024-07-12 12:01:01.440514] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:11.315 12:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:12.250 12:01:02 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:12.250 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:12.250 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:12.250 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:12.250 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:12.250 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.250 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:12.510 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:12.510 "name": "raid_bdev1", 00:20:12.510 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:12.510 "strip_size_kb": 0, 00:20:12.510 "state": "online", 00:20:12.510 "raid_level": "raid1", 00:20:12.510 "superblock": true, 00:20:12.510 "num_base_bdevs": 4, 00:20:12.510 "num_base_bdevs_discovered": 4, 00:20:12.510 "num_base_bdevs_operational": 4, 00:20:12.510 "process": { 00:20:12.510 "type": "rebuild", 00:20:12.510 "target": "spare", 00:20:12.510 "progress": { 00:20:12.510 "blocks": 22528, 00:20:12.510 "percent": 35 00:20:12.510 } 00:20:12.510 }, 00:20:12.510 "base_bdevs_list": [ 00:20:12.510 { 00:20:12.510 "name": "spare", 00:20:12.510 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:12.510 "is_configured": true, 00:20:12.510 "data_offset": 2048, 00:20:12.510 "data_size": 63488 00:20:12.510 }, 00:20:12.510 { 00:20:12.510 "name": "BaseBdev2", 00:20:12.510 "uuid": "cf2c7219-1827-5692-b8dd-e9b853059b23", 00:20:12.510 "is_configured": true, 00:20:12.510 "data_offset": 2048, 00:20:12.510 "data_size": 63488 
00:20:12.510 }, 00:20:12.510 { 00:20:12.510 "name": "BaseBdev3", 00:20:12.510 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:12.510 "is_configured": true, 00:20:12.510 "data_offset": 2048, 00:20:12.510 "data_size": 63488 00:20:12.510 }, 00:20:12.510 { 00:20:12.510 "name": "BaseBdev4", 00:20:12.510 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:12.510 "is_configured": true, 00:20:12.510 "data_offset": 2048, 00:20:12.510 "data_size": 63488 00:20:12.510 } 00:20:12.510 ] 00:20:12.510 }' 00:20:12.510 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:12.510 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:12.510 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:12.510 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:12.510 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:12.769 [2024-07-12 12:01:02.848696] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:12.769 [2024-07-12 12:01:02.850328] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:12.769 [2024-07-12 12:01:02.850353] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:12.769 [2024-07-12 12:01:02.850363] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:12.769 [2024-07-12 12:01:02.850383] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.769 12:01:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:13.027 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:13.027 "name": "raid_bdev1", 00:20:13.027 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:13.027 "strip_size_kb": 0, 00:20:13.027 "state": "online", 00:20:13.027 "raid_level": "raid1", 00:20:13.027 "superblock": true, 00:20:13.027 "num_base_bdevs": 4, 00:20:13.027 "num_base_bdevs_discovered": 3, 00:20:13.027 "num_base_bdevs_operational": 3, 00:20:13.027 "base_bdevs_list": [ 00:20:13.027 { 00:20:13.027 "name": null, 00:20:13.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.027 "is_configured": false, 00:20:13.027 "data_offset": 2048, 00:20:13.027 "data_size": 63488 00:20:13.027 }, 00:20:13.027 { 00:20:13.027 "name": "BaseBdev2", 
00:20:13.027 "uuid": "cf2c7219-1827-5692-b8dd-e9b853059b23", 00:20:13.027 "is_configured": true, 00:20:13.027 "data_offset": 2048, 00:20:13.027 "data_size": 63488 00:20:13.027 }, 00:20:13.027 { 00:20:13.027 "name": "BaseBdev3", 00:20:13.027 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:13.027 "is_configured": true, 00:20:13.027 "data_offset": 2048, 00:20:13.027 "data_size": 63488 00:20:13.027 }, 00:20:13.027 { 00:20:13.027 "name": "BaseBdev4", 00:20:13.027 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:13.027 "is_configured": true, 00:20:13.027 "data_offset": 2048, 00:20:13.027 "data_size": 63488 00:20:13.027 } 00:20:13.027 ] 00:20:13.027 }' 00:20:13.027 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:13.027 12:01:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:13.287 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:13.287 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:13.287 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:13.287 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:13.287 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:13.287 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:13.287 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.584 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:13.584 "name": "raid_bdev1", 00:20:13.584 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:13.584 "strip_size_kb": 0, 00:20:13.584 "state": 
"online", 00:20:13.584 "raid_level": "raid1", 00:20:13.584 "superblock": true, 00:20:13.584 "num_base_bdevs": 4, 00:20:13.584 "num_base_bdevs_discovered": 3, 00:20:13.584 "num_base_bdevs_operational": 3, 00:20:13.584 "base_bdevs_list": [ 00:20:13.584 { 00:20:13.584 "name": null, 00:20:13.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.584 "is_configured": false, 00:20:13.584 "data_offset": 2048, 00:20:13.584 "data_size": 63488 00:20:13.584 }, 00:20:13.584 { 00:20:13.584 "name": "BaseBdev2", 00:20:13.584 "uuid": "cf2c7219-1827-5692-b8dd-e9b853059b23", 00:20:13.584 "is_configured": true, 00:20:13.584 "data_offset": 2048, 00:20:13.584 "data_size": 63488 00:20:13.584 }, 00:20:13.584 { 00:20:13.584 "name": "BaseBdev3", 00:20:13.584 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:13.584 "is_configured": true, 00:20:13.584 "data_offset": 2048, 00:20:13.584 "data_size": 63488 00:20:13.584 }, 00:20:13.584 { 00:20:13.584 "name": "BaseBdev4", 00:20:13.584 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:13.584 "is_configured": true, 00:20:13.584 "data_offset": 2048, 00:20:13.584 "data_size": 63488 00:20:13.584 } 00:20:13.584 ] 00:20:13.584 }' 00:20:13.584 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:13.584 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:13.584 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:13.584 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:13.584 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:13.879 [2024-07-12 12:01:03.924805] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:13.879 [2024-07-12 12:01:03.928357] 
bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x256d690 00:20:13.879 [2024-07-12 12:01:03.929376] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:13.879 12:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:14.812 12:01:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:14.812 12:01:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:14.812 12:01:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:14.812 12:01:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:14.812 12:01:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:14.812 12:01:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.812 12:01:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:15.075 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:15.075 "name": "raid_bdev1", 00:20:15.075 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:15.075 "strip_size_kb": 0, 00:20:15.075 "state": "online", 00:20:15.075 "raid_level": "raid1", 00:20:15.075 "superblock": true, 00:20:15.075 "num_base_bdevs": 4, 00:20:15.075 "num_base_bdevs_discovered": 4, 00:20:15.075 "num_base_bdevs_operational": 4, 00:20:15.075 "process": { 00:20:15.075 "type": "rebuild", 00:20:15.075 "target": "spare", 00:20:15.075 "progress": { 00:20:15.075 "blocks": 22528, 00:20:15.075 "percent": 35 00:20:15.075 } 00:20:15.075 }, 00:20:15.075 "base_bdevs_list": [ 00:20:15.075 { 00:20:15.075 "name": "spare", 00:20:15.075 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 
00:20:15.075 "is_configured": true, 00:20:15.075 "data_offset": 2048, 00:20:15.075 "data_size": 63488 00:20:15.075 }, 00:20:15.075 { 00:20:15.075 "name": "BaseBdev2", 00:20:15.075 "uuid": "cf2c7219-1827-5692-b8dd-e9b853059b23", 00:20:15.075 "is_configured": true, 00:20:15.075 "data_offset": 2048, 00:20:15.075 "data_size": 63488 00:20:15.075 }, 00:20:15.075 { 00:20:15.075 "name": "BaseBdev3", 00:20:15.075 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:15.075 "is_configured": true, 00:20:15.075 "data_offset": 2048, 00:20:15.075 "data_size": 63488 00:20:15.075 }, 00:20:15.075 { 00:20:15.075 "name": "BaseBdev4", 00:20:15.076 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:15.076 "is_configured": true, 00:20:15.076 "data_offset": 2048, 00:20:15.076 "data_size": 63488 00:20:15.076 } 00:20:15.076 ] 00:20:15.076 }' 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:20:15.076 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:15.076 12:01:05 bdev_raid.raid_rebuild_test_sb -- 
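The `[: =: unary operator expected` failure recorded just above (bdev_raid.sh line 665) is the classic single-bracket pitfall: the traced command `'[' = false ']'` shows that the left-hand variable expanded to nothing, leaving `[` with too few operands. A minimal standalone sketch of the failure mode and the usual fixes (the variable name `flag` is hypothetical, not taken from bdev_raid.sh):

```shell
#!/usr/bin/env bash
flag=""

# Unquoted empty expansion reproduces the logged error:
#   [: =: unary operator expected   (exit status 2)
[ $flag = false ] 2>/dev/null || echo "single-bracket test errored as logged"

# Fix 1: quote the expansion so [ always sees three operands.
if [ "$flag" = false ]; then
    echo "flag is false"
else
    echo "flag is empty or not false"
fi

# Fix 2: bash's [[ ]] does no word splitting, so an empty variable is safe.
if [[ $flag == false ]]; then
    echo "flag is false"
else
    echo "flag is empty or not false"
fi
```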
bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:15.336 [2024-07-12 12:01:05.337641] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:15.336 [2024-07-12 12:01:05.439550] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x256d690 00:20:15.336 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:15.336 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:15.336 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:15.336 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:15.336 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:15.336 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:15.336 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:15.336 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.336 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:15.595 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:15.595 "name": "raid_bdev1", 00:20:15.595 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:15.595 "strip_size_kb": 0, 00:20:15.595 "state": "online", 00:20:15.595 "raid_level": "raid1", 00:20:15.595 "superblock": true, 00:20:15.595 "num_base_bdevs": 4, 00:20:15.595 "num_base_bdevs_discovered": 3, 00:20:15.595 "num_base_bdevs_operational": 3, 00:20:15.595 "process": { 00:20:15.595 "type": 
"rebuild", 00:20:15.595 "target": "spare", 00:20:15.595 "progress": { 00:20:15.595 "blocks": 30720, 00:20:15.595 "percent": 48 00:20:15.595 } 00:20:15.595 }, 00:20:15.595 "base_bdevs_list": [ 00:20:15.595 { 00:20:15.595 "name": "spare", 00:20:15.595 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:15.595 "is_configured": true, 00:20:15.595 "data_offset": 2048, 00:20:15.595 "data_size": 63488 00:20:15.595 }, 00:20:15.595 { 00:20:15.595 "name": null, 00:20:15.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.595 "is_configured": false, 00:20:15.595 "data_offset": 2048, 00:20:15.595 "data_size": 63488 00:20:15.595 }, 00:20:15.595 { 00:20:15.595 "name": "BaseBdev3", 00:20:15.595 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:15.595 "is_configured": true, 00:20:15.595 "data_offset": 2048, 00:20:15.595 "data_size": 63488 00:20:15.595 }, 00:20:15.595 { 00:20:15.595 "name": "BaseBdev4", 00:20:15.595 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:15.595 "is_configured": true, 00:20:15.595 "data_offset": 2048, 00:20:15.595 "data_size": 63488 00:20:15.595 } 00:20:15.595 ] 00:20:15.595 }' 00:20:15.595 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:15.595 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:15.595 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:15.595 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:15.595 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=685 00:20:15.596 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:15.596 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:15.596 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 
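The `local timeout=685` / `(( SECONDS < timeout ))` / `sleep 1` sequence traced in this stretch of the log is bash's standard poll-with-deadline idiom: `SECONDS` counts seconds since shell start, so the script re-queries rebuild progress once per second until it completes or the absolute deadline (685 in the logged run) passes. A compressed standalone sketch under stated assumptions: `check_done` is an invented stand-in for the rpc.py/jq progress query, which needs a live SPDK target, and the deadline is shortened for the example:

```shell
#!/usr/bin/env bash
# Hypothetical stand-in for the rpc.py bdev_raid_get_bdevs / jq progress
# check; here it reports completion on the second poll.
polls=0
check_done() { polls=$((polls + 1)); [ "$polls" -ge 2 ]; }

rebuilt=0
timeout=$((SECONDS + 5))   # the logged test compares against an absolute 685
while (( SECONDS < timeout )); do
    if check_done; then
        rebuilt=1
        break
    fi
    sleep 1                # same 1-second poll interval as the logged script
done
(( rebuilt == 1 )) && echo "rebuild finished before the deadline"
```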
-- # local raid_bdev_name=raid_bdev1 00:20:15.596 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:15.596 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:15.596 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:15.596 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.596 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:15.854 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:15.854 "name": "raid_bdev1", 00:20:15.854 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:15.854 "strip_size_kb": 0, 00:20:15.854 "state": "online", 00:20:15.854 "raid_level": "raid1", 00:20:15.854 "superblock": true, 00:20:15.854 "num_base_bdevs": 4, 00:20:15.854 "num_base_bdevs_discovered": 3, 00:20:15.854 "num_base_bdevs_operational": 3, 00:20:15.854 "process": { 00:20:15.854 "type": "rebuild", 00:20:15.854 "target": "spare", 00:20:15.854 "progress": { 00:20:15.854 "blocks": 36864, 00:20:15.854 "percent": 58 00:20:15.854 } 00:20:15.854 }, 00:20:15.854 "base_bdevs_list": [ 00:20:15.854 { 00:20:15.854 "name": "spare", 00:20:15.854 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:15.854 "is_configured": true, 00:20:15.854 "data_offset": 2048, 00:20:15.854 "data_size": 63488 00:20:15.854 }, 00:20:15.855 { 00:20:15.855 "name": null, 00:20:15.855 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.855 "is_configured": false, 00:20:15.855 "data_offset": 2048, 00:20:15.855 "data_size": 63488 00:20:15.855 }, 00:20:15.855 { 00:20:15.855 "name": "BaseBdev3", 00:20:15.855 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:15.855 "is_configured": true, 00:20:15.855 "data_offset": 2048, 
00:20:15.855 "data_size": 63488 00:20:15.855 }, 00:20:15.855 { 00:20:15.855 "name": "BaseBdev4", 00:20:15.855 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:15.855 "is_configured": true, 00:20:15.855 "data_offset": 2048, 00:20:15.855 "data_size": 63488 00:20:15.855 } 00:20:15.855 ] 00:20:15.855 }' 00:20:15.855 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:15.855 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:15.855 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:15.855 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:15.855 12:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:16.789 12:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:16.789 12:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:16.789 12:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:16.789 12:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:16.789 12:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:16.789 12:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:16.789 12:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.789 12:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:17.048 12:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:17.048 "name": "raid_bdev1", 00:20:17.048 
"uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:17.048 "strip_size_kb": 0, 00:20:17.048 "state": "online", 00:20:17.048 "raid_level": "raid1", 00:20:17.048 "superblock": true, 00:20:17.048 "num_base_bdevs": 4, 00:20:17.048 "num_base_bdevs_discovered": 3, 00:20:17.048 "num_base_bdevs_operational": 3, 00:20:17.048 "process": { 00:20:17.048 "type": "rebuild", 00:20:17.048 "target": "spare", 00:20:17.048 "progress": { 00:20:17.048 "blocks": 61440, 00:20:17.048 "percent": 96 00:20:17.048 } 00:20:17.048 }, 00:20:17.048 "base_bdevs_list": [ 00:20:17.048 { 00:20:17.048 "name": "spare", 00:20:17.048 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:17.048 "is_configured": true, 00:20:17.048 "data_offset": 2048, 00:20:17.048 "data_size": 63488 00:20:17.048 }, 00:20:17.048 { 00:20:17.048 "name": null, 00:20:17.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.048 "is_configured": false, 00:20:17.048 "data_offset": 2048, 00:20:17.048 "data_size": 63488 00:20:17.048 }, 00:20:17.048 { 00:20:17.048 "name": "BaseBdev3", 00:20:17.048 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:17.048 "is_configured": true, 00:20:17.048 "data_offset": 2048, 00:20:17.048 "data_size": 63488 00:20:17.048 }, 00:20:17.048 { 00:20:17.048 "name": "BaseBdev4", 00:20:17.048 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:17.048 "is_configured": true, 00:20:17.048 "data_offset": 2048, 00:20:17.048 "data_size": 63488 00:20:17.048 } 00:20:17.048 ] 00:20:17.048 }' 00:20:17.048 12:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:17.048 12:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:17.048 12:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:17.048 [2024-07-12 12:01:07.151414] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:17.048 [2024-07-12 12:01:07.151452] 
bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:17.048 [2024-07-12 12:01:07.151524] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:17.048 12:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:17.048 12:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:17.982 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:17.982 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:17.982 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:17.982 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:17.982 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:17.982 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:17.982 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.982 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:18.242 "name": "raid_bdev1", 00:20:18.242 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:18.242 "strip_size_kb": 0, 00:20:18.242 "state": "online", 00:20:18.242 "raid_level": "raid1", 00:20:18.242 "superblock": true, 00:20:18.242 "num_base_bdevs": 4, 00:20:18.242 "num_base_bdevs_discovered": 3, 00:20:18.242 "num_base_bdevs_operational": 3, 00:20:18.242 "base_bdevs_list": [ 00:20:18.242 { 00:20:18.242 "name": "spare", 00:20:18.242 "uuid": 
"6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:18.242 "is_configured": true, 00:20:18.242 "data_offset": 2048, 00:20:18.242 "data_size": 63488 00:20:18.242 }, 00:20:18.242 { 00:20:18.242 "name": null, 00:20:18.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.242 "is_configured": false, 00:20:18.242 "data_offset": 2048, 00:20:18.242 "data_size": 63488 00:20:18.242 }, 00:20:18.242 { 00:20:18.242 "name": "BaseBdev3", 00:20:18.242 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:18.242 "is_configured": true, 00:20:18.242 "data_offset": 2048, 00:20:18.242 "data_size": 63488 00:20:18.242 }, 00:20:18.242 { 00:20:18.242 "name": "BaseBdev4", 00:20:18.242 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:18.242 "is_configured": true, 00:20:18.242 "data_offset": 2048, 00:20:18.242 "data_size": 63488 00:20:18.242 } 00:20:18.242 ] 00:20:18.242 }' 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:18.242 12:01:08 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.242 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:18.501 "name": "raid_bdev1", 00:20:18.501 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:18.501 "strip_size_kb": 0, 00:20:18.501 "state": "online", 00:20:18.501 "raid_level": "raid1", 00:20:18.501 "superblock": true, 00:20:18.501 "num_base_bdevs": 4, 00:20:18.501 "num_base_bdevs_discovered": 3, 00:20:18.501 "num_base_bdevs_operational": 3, 00:20:18.501 "base_bdevs_list": [ 00:20:18.501 { 00:20:18.501 "name": "spare", 00:20:18.501 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:18.501 "is_configured": true, 00:20:18.501 "data_offset": 2048, 00:20:18.501 "data_size": 63488 00:20:18.501 }, 00:20:18.501 { 00:20:18.501 "name": null, 00:20:18.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.501 "is_configured": false, 00:20:18.501 "data_offset": 2048, 00:20:18.501 "data_size": 63488 00:20:18.501 }, 00:20:18.501 { 00:20:18.501 "name": "BaseBdev3", 00:20:18.501 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:18.501 "is_configured": true, 00:20:18.501 "data_offset": 2048, 00:20:18.501 "data_size": 63488 00:20:18.501 }, 00:20:18.501 { 00:20:18.501 "name": "BaseBdev4", 00:20:18.501 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:18.501 "is_configured": true, 00:20:18.501 "data_offset": 2048, 00:20:18.501 "data_size": 63488 00:20:18.501 } 00:20:18.501 ] 00:20:18.501 }' 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.501 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.760 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:18.760 "name": "raid_bdev1", 00:20:18.760 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:18.760 "strip_size_kb": 0, 00:20:18.760 "state": "online", 00:20:18.760 "raid_level": "raid1", 00:20:18.760 "superblock": true, 00:20:18.760 "num_base_bdevs": 4, 00:20:18.760 
"num_base_bdevs_discovered": 3, 00:20:18.760 "num_base_bdevs_operational": 3, 00:20:18.760 "base_bdevs_list": [ 00:20:18.760 { 00:20:18.760 "name": "spare", 00:20:18.760 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:18.760 "is_configured": true, 00:20:18.760 "data_offset": 2048, 00:20:18.760 "data_size": 63488 00:20:18.760 }, 00:20:18.760 { 00:20:18.760 "name": null, 00:20:18.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.760 "is_configured": false, 00:20:18.760 "data_offset": 2048, 00:20:18.760 "data_size": 63488 00:20:18.760 }, 00:20:18.760 { 00:20:18.760 "name": "BaseBdev3", 00:20:18.760 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:18.760 "is_configured": true, 00:20:18.760 "data_offset": 2048, 00:20:18.760 "data_size": 63488 00:20:18.760 }, 00:20:18.760 { 00:20:18.760 "name": "BaseBdev4", 00:20:18.760 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:18.760 "is_configured": true, 00:20:18.760 "data_offset": 2048, 00:20:18.760 "data_size": 63488 00:20:18.760 } 00:20:18.760 ] 00:20:18.760 }' 00:20:18.760 12:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:18.760 12:01:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:19.327 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:19.327 [2024-07-12 12:01:09.505220] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:19.327 [2024-07-12 12:01:09.505241] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:19.327 [2024-07-12 12:01:09.505283] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:19.327 [2024-07-12 12:01:09.505334] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:19.327 [2024-07-12 
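Throughout this run, `verify_raid_bdev_state` shells out to `rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all`, filters the array with `jq -r '.[] | select(.name == "raid_bdev1")'`, and then compares `state`, `raid_level`, and the discovered/operational counts. The same check can be sketched offline against a payload shaped like the ones captured above; no SPDK target is required, `python3` stands in for `jq`, and the temp-file path is invented for the example:

```shell
#!/usr/bin/env bash
# Payload shaped like the bdev_raid_get_bdevs output recorded in this log.
cat > /tmp/raid_bdevs.json <<'EOF'
[{"name": "raid_bdev1", "state": "online", "raid_level": "raid1",
  "num_base_bdevs": 4, "num_base_bdevs_discovered": 3,
  "num_base_bdevs_operational": 3}]
EOF

# Equivalent of: jq -r '.[] | select(.name == "raid_bdev1")' followed by
# the field comparisons verify_raid_bdev_state performs.
state=$(python3 -c '
import json
bdevs = json.load(open("/tmp/raid_bdevs.json"))
info = next(b for b in bdevs if b["name"] == "raid_bdev1")
print(info["state"], info["raid_level"], info["num_base_bdevs_discovered"])
')
echo "raid_bdev1: $state"
[ "$state" = "online raid1 3" ] && echo "state check passed"
```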
12:01:09.505341] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2717dd0 name raid_bdev1, state offline 00:20:19.327 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.327 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:19.586 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:19.843 /dev/nbd0 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:19.843 1+0 records in 00:20:19.843 1+0 records out 00:20:19.843 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233869 s, 17.5 MB/s 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:19.843 12:01:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:19.844 12:01:09 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:19.844 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:19.844 12:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:19.844 /dev/nbd1 00:20:19.844 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:19.844 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:19.844 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:20.101 1+0 records in 00:20:20.101 1+0 records out 00:20:20.101 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221175 s, 18.5 MB/s 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@884 -- # size=4096 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- 
# (( i = 1 )) 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:20.101 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:20.102 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:20:20.359 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:20.617 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:20:20.617 [2024-07-12 12:01:10.843871] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:20.617 [2024-07-12 12:01:10.843902] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:20.617 [2024-07-12 12:01:10.843914] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x271f360 00:20:20.617 [2024-07-12 12:01:10.843936] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:20.617 [2024-07-12 12:01:10.845123] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:20.617 [2024-07-12 12:01:10.845143] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:20.617 [2024-07-12 12:01:10.845192] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:20.617 [2024-07-12 12:01:10.845211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:20.617 [2024-07-12 12:01:10.845280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:20.617 [2024-07-12 12:01:10.845331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:20.617 spare 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:20.876 12:01:10 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:20.876 12:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.876 [2024-07-12 12:01:10.945624] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x270a4b0 00:20:20.876 [2024-07-12 12:01:10.945635] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:20.876 [2024-07-12 12:01:10.945753] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2718ee0 00:20:20.876 [2024-07-12 12:01:10.945850] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x270a4b0 00:20:20.876 [2024-07-12 12:01:10.945855] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x270a4b0 00:20:20.876 [2024-07-12 12:01:10.945918] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:20.876 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.876 "name": "raid_bdev1", 00:20:20.876 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:20.876 "strip_size_kb": 0, 00:20:20.876 "state": "online", 00:20:20.876 "raid_level": "raid1", 00:20:20.876 "superblock": true, 00:20:20.876 "num_base_bdevs": 4, 00:20:20.876 "num_base_bdevs_discovered": 3, 00:20:20.876 "num_base_bdevs_operational": 3, 00:20:20.876 "base_bdevs_list": [ 00:20:20.876 { 00:20:20.876 "name": 
"spare", 00:20:20.876 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:20.876 "is_configured": true, 00:20:20.876 "data_offset": 2048, 00:20:20.876 "data_size": 63488 00:20:20.876 }, 00:20:20.876 { 00:20:20.876 "name": null, 00:20:20.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.876 "is_configured": false, 00:20:20.876 "data_offset": 2048, 00:20:20.876 "data_size": 63488 00:20:20.876 }, 00:20:20.876 { 00:20:20.876 "name": "BaseBdev3", 00:20:20.876 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:20.876 "is_configured": true, 00:20:20.876 "data_offset": 2048, 00:20:20.876 "data_size": 63488 00:20:20.876 }, 00:20:20.876 { 00:20:20.876 "name": "BaseBdev4", 00:20:20.876 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:20.876 "is_configured": true, 00:20:20.876 "data_offset": 2048, 00:20:20.876 "data_size": 63488 00:20:20.876 } 00:20:20.876 ] 00:20:20.876 }' 00:20:20.876 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.876 12:01:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:21.443 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:21.443 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:21.443 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:21.443 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:21.443 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:21.443 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.443 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:21.702 12:01:11 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:21.702 "name": "raid_bdev1", 00:20:21.702 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:21.702 "strip_size_kb": 0, 00:20:21.702 "state": "online", 00:20:21.702 "raid_level": "raid1", 00:20:21.702 "superblock": true, 00:20:21.702 "num_base_bdevs": 4, 00:20:21.702 "num_base_bdevs_discovered": 3, 00:20:21.702 "num_base_bdevs_operational": 3, 00:20:21.702 "base_bdevs_list": [ 00:20:21.702 { 00:20:21.702 "name": "spare", 00:20:21.702 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:21.702 "is_configured": true, 00:20:21.702 "data_offset": 2048, 00:20:21.702 "data_size": 63488 00:20:21.702 }, 00:20:21.702 { 00:20:21.702 "name": null, 00:20:21.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:21.702 "is_configured": false, 00:20:21.702 "data_offset": 2048, 00:20:21.702 "data_size": 63488 00:20:21.702 }, 00:20:21.702 { 00:20:21.702 "name": "BaseBdev3", 00:20:21.702 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:21.702 "is_configured": true, 00:20:21.702 "data_offset": 2048, 00:20:21.702 "data_size": 63488 00:20:21.702 }, 00:20:21.702 { 00:20:21.702 "name": "BaseBdev4", 00:20:21.702 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:21.702 "is_configured": true, 00:20:21.702 "data_offset": 2048, 00:20:21.702 "data_size": 63488 00:20:21.702 } 00:20:21.702 ] 00:20:21.702 }' 00:20:21.702 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:21.702 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:21.702 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:21.702 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:21.702 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.702 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:21.961 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:20:21.961 12:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:21.961 [2024-07-12 12:01:12.123226] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.961 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:20:22.220 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.220 "name": "raid_bdev1", 00:20:22.220 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:22.220 "strip_size_kb": 0, 00:20:22.220 "state": "online", 00:20:22.220 "raid_level": "raid1", 00:20:22.220 "superblock": true, 00:20:22.220 "num_base_bdevs": 4, 00:20:22.220 "num_base_bdevs_discovered": 2, 00:20:22.220 "num_base_bdevs_operational": 2, 00:20:22.220 "base_bdevs_list": [ 00:20:22.220 { 00:20:22.220 "name": null, 00:20:22.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.220 "is_configured": false, 00:20:22.220 "data_offset": 2048, 00:20:22.220 "data_size": 63488 00:20:22.220 }, 00:20:22.220 { 00:20:22.220 "name": null, 00:20:22.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.220 "is_configured": false, 00:20:22.220 "data_offset": 2048, 00:20:22.220 "data_size": 63488 00:20:22.220 }, 00:20:22.220 { 00:20:22.220 "name": "BaseBdev3", 00:20:22.220 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:22.220 "is_configured": true, 00:20:22.220 "data_offset": 2048, 00:20:22.220 "data_size": 63488 00:20:22.220 }, 00:20:22.220 { 00:20:22.220 "name": "BaseBdev4", 00:20:22.220 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:22.220 "is_configured": true, 00:20:22.220 "data_offset": 2048, 00:20:22.220 "data_size": 63488 00:20:22.220 } 00:20:22.220 ] 00:20:22.220 }' 00:20:22.220 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.220 12:01:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:22.799 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:22.799 [2024-07-12 12:01:12.913285] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:22.799 [2024-07-12 
12:01:12.913404] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:22.799 [2024-07-12 12:01:12.913414] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:22.799 [2024-07-12 12:01:12.913431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:22.799 [2024-07-12 12:01:12.916835] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x269f0e0 00:20:22.799 [2024-07-12 12:01:12.918300] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:22.799 12:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:20:23.733 12:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:23.733 12:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:23.733 12:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:23.733 12:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:23.733 12:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:23.733 12:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.733 12:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:23.991 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:23.991 "name": "raid_bdev1", 00:20:23.991 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:23.991 "strip_size_kb": 0, 00:20:23.991 "state": "online", 00:20:23.991 "raid_level": "raid1", 00:20:23.991 "superblock": true, 00:20:23.991 
"num_base_bdevs": 4, 00:20:23.991 "num_base_bdevs_discovered": 3, 00:20:23.991 "num_base_bdevs_operational": 3, 00:20:23.991 "process": { 00:20:23.991 "type": "rebuild", 00:20:23.991 "target": "spare", 00:20:23.991 "progress": { 00:20:23.991 "blocks": 22528, 00:20:23.991 "percent": 35 00:20:23.991 } 00:20:23.991 }, 00:20:23.991 "base_bdevs_list": [ 00:20:23.991 { 00:20:23.991 "name": "spare", 00:20:23.991 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:23.991 "is_configured": true, 00:20:23.991 "data_offset": 2048, 00:20:23.991 "data_size": 63488 00:20:23.991 }, 00:20:23.991 { 00:20:23.991 "name": null, 00:20:23.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:23.991 "is_configured": false, 00:20:23.991 "data_offset": 2048, 00:20:23.991 "data_size": 63488 00:20:23.991 }, 00:20:23.991 { 00:20:23.991 "name": "BaseBdev3", 00:20:23.991 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:23.991 "is_configured": true, 00:20:23.991 "data_offset": 2048, 00:20:23.991 "data_size": 63488 00:20:23.991 }, 00:20:23.991 { 00:20:23.991 "name": "BaseBdev4", 00:20:23.991 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:23.991 "is_configured": true, 00:20:23.991 "data_offset": 2048, 00:20:23.991 "data_size": 63488 00:20:23.991 } 00:20:23.991 ] 00:20:23.991 }' 00:20:23.991 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:23.991 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:23.991 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:23.991 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:23.991 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:24.249 [2024-07-12 12:01:14.342515] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:24.249 [2024-07-12 12:01:14.428770] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:24.249 [2024-07-12 12:01:14.428798] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:24.249 [2024-07-12 12:01:14.428807] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:24.249 [2024-07-12 12:01:14.428811] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.249 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.250 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:20:24.508 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.508 "name": "raid_bdev1", 00:20:24.508 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:24.508 "strip_size_kb": 0, 00:20:24.508 "state": "online", 00:20:24.508 "raid_level": "raid1", 00:20:24.508 "superblock": true, 00:20:24.508 "num_base_bdevs": 4, 00:20:24.508 "num_base_bdevs_discovered": 2, 00:20:24.508 "num_base_bdevs_operational": 2, 00:20:24.508 "base_bdevs_list": [ 00:20:24.508 { 00:20:24.508 "name": null, 00:20:24.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.508 "is_configured": false, 00:20:24.508 "data_offset": 2048, 00:20:24.508 "data_size": 63488 00:20:24.508 }, 00:20:24.508 { 00:20:24.508 "name": null, 00:20:24.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.508 "is_configured": false, 00:20:24.508 "data_offset": 2048, 00:20:24.508 "data_size": 63488 00:20:24.508 }, 00:20:24.508 { 00:20:24.508 "name": "BaseBdev3", 00:20:24.509 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:24.509 "is_configured": true, 00:20:24.509 "data_offset": 2048, 00:20:24.509 "data_size": 63488 00:20:24.509 }, 00:20:24.509 { 00:20:24.509 "name": "BaseBdev4", 00:20:24.509 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:24.509 "is_configured": true, 00:20:24.509 "data_offset": 2048, 00:20:24.509 "data_size": 63488 00:20:24.509 } 00:20:24.509 ] 00:20:24.509 }' 00:20:24.509 12:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.509 12:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:25.075 12:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:25.075 [2024-07-12 12:01:15.258543] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:25.075 [2024-07-12 
12:01:15.258579] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:25.075 [2024-07-12 12:01:15.258608] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2709890 00:20:25.075 [2024-07-12 12:01:15.258615] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:25.075 [2024-07-12 12:01:15.258883] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:25.075 [2024-07-12 12:01:15.258893] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:25.075 [2024-07-12 12:01:15.258944] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:25.075 [2024-07-12 12:01:15.258951] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:25.075 [2024-07-12 12:01:15.258957] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:25.075 [2024-07-12 12:01:15.258968] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:25.075 [2024-07-12 12:01:15.262344] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x271ddc0 00:20:25.075 [2024-07-12 12:01:15.263349] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:25.075 spare 00:20:25.075 12:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:26.450 "name": "raid_bdev1", 00:20:26.450 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:26.450 "strip_size_kb": 0, 00:20:26.450 "state": "online", 00:20:26.450 "raid_level": "raid1", 00:20:26.450 "superblock": true, 00:20:26.450 "num_base_bdevs": 4, 00:20:26.450 "num_base_bdevs_discovered": 3, 00:20:26.450 "num_base_bdevs_operational": 3, 00:20:26.450 "process": { 00:20:26.450 "type": "rebuild", 00:20:26.450 "target": "spare", 00:20:26.450 "progress": { 00:20:26.450 "blocks": 22528, 00:20:26.450 
"percent": 35 00:20:26.450 } 00:20:26.450 }, 00:20:26.450 "base_bdevs_list": [ 00:20:26.450 { 00:20:26.450 "name": "spare", 00:20:26.450 "uuid": "6d6cdb2c-388f-5b1b-bec1-ef5ae03e0298", 00:20:26.450 "is_configured": true, 00:20:26.450 "data_offset": 2048, 00:20:26.450 "data_size": 63488 00:20:26.450 }, 00:20:26.450 { 00:20:26.450 "name": null, 00:20:26.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.450 "is_configured": false, 00:20:26.450 "data_offset": 2048, 00:20:26.450 "data_size": 63488 00:20:26.450 }, 00:20:26.450 { 00:20:26.450 "name": "BaseBdev3", 00:20:26.450 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:26.450 "is_configured": true, 00:20:26.450 "data_offset": 2048, 00:20:26.450 "data_size": 63488 00:20:26.450 }, 00:20:26.450 { 00:20:26.450 "name": "BaseBdev4", 00:20:26.450 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:26.450 "is_configured": true, 00:20:26.450 "data_offset": 2048, 00:20:26.450 "data_size": 63488 00:20:26.450 } 00:20:26.450 ] 00:20:26.450 }' 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:26.450 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:26.450 [2024-07-12 12:01:16.683602] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:26.708 [2024-07-12 12:01:16.773917] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:26.708 [2024-07-12 12:01:16.773945] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:26.708 [2024-07-12 12:01:16.773953] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:26.708 [2024-07-12 12:01:16.773957] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:26.708 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:26.708 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:26.708 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:26.708 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:26.708 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:26.709 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:26.709 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:26.709 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:26.709 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:26.709 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:26.709 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.709 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:26.966 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.966 "name": "raid_bdev1", 00:20:26.966 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:26.966 "strip_size_kb": 0, 00:20:26.966 "state": 
"online", 00:20:26.966 "raid_level": "raid1", 00:20:26.966 "superblock": true, 00:20:26.966 "num_base_bdevs": 4, 00:20:26.966 "num_base_bdevs_discovered": 2, 00:20:26.966 "num_base_bdevs_operational": 2, 00:20:26.966 "base_bdevs_list": [ 00:20:26.966 { 00:20:26.966 "name": null, 00:20:26.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.966 "is_configured": false, 00:20:26.966 "data_offset": 2048, 00:20:26.966 "data_size": 63488 00:20:26.966 }, 00:20:26.966 { 00:20:26.966 "name": null, 00:20:26.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.966 "is_configured": false, 00:20:26.966 "data_offset": 2048, 00:20:26.966 "data_size": 63488 00:20:26.967 }, 00:20:26.967 { 00:20:26.967 "name": "BaseBdev3", 00:20:26.967 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:26.967 "is_configured": true, 00:20:26.967 "data_offset": 2048, 00:20:26.967 "data_size": 63488 00:20:26.967 }, 00:20:26.967 { 00:20:26.967 "name": "BaseBdev4", 00:20:26.967 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:26.967 "is_configured": true, 00:20:26.967 "data_offset": 2048, 00:20:26.967 "data_size": 63488 00:20:26.967 } 00:20:26.967 ] 00:20:26.967 }' 00:20:26.967 12:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.967 12:01:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:27.226 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:27.226 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:27.226 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:27.226 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:27.226 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:27.226 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.226 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.486 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:27.486 "name": "raid_bdev1", 00:20:27.486 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:27.486 "strip_size_kb": 0, 00:20:27.486 "state": "online", 00:20:27.486 "raid_level": "raid1", 00:20:27.486 "superblock": true, 00:20:27.486 "num_base_bdevs": 4, 00:20:27.486 "num_base_bdevs_discovered": 2, 00:20:27.486 "num_base_bdevs_operational": 2, 00:20:27.486 "base_bdevs_list": [ 00:20:27.486 { 00:20:27.486 "name": null, 00:20:27.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.486 "is_configured": false, 00:20:27.486 "data_offset": 2048, 00:20:27.486 "data_size": 63488 00:20:27.486 }, 00:20:27.486 { 00:20:27.486 "name": null, 00:20:27.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.486 "is_configured": false, 00:20:27.486 "data_offset": 2048, 00:20:27.486 "data_size": 63488 00:20:27.486 }, 00:20:27.486 { 00:20:27.486 "name": "BaseBdev3", 00:20:27.486 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:27.486 "is_configured": true, 00:20:27.486 "data_offset": 2048, 00:20:27.486 "data_size": 63488 00:20:27.486 }, 00:20:27.486 { 00:20:27.486 "name": "BaseBdev4", 00:20:27.486 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:27.486 "is_configured": true, 00:20:27.486 "data_offset": 2048, 00:20:27.486 "data_size": 63488 00:20:27.486 } 00:20:27.486 ] 00:20:27.486 }' 00:20:27.486 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:27.486 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:27.486 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:20:27.486 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:27.486 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:27.745 12:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:28.005 [2024-07-12 12:01:17.996746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:28.005 [2024-07-12 12:01:17.996778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:28.005 [2024-07-12 12:01:17.996791] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2718cc0 00:20:28.005 [2024-07-12 12:01:17.996807] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:28.005 [2024-07-12 12:01:17.997065] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:28.005 [2024-07-12 12:01:17.997075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:28.005 [2024-07-12 12:01:17.997116] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:28.005 [2024-07-12 12:01:17.997123] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:28.005 [2024-07-12 12:01:17.997128] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:28.005 BaseBdev1 00:20:28.005 12:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:28.940 
12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.940 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.197 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.197 "name": "raid_bdev1", 00:20:29.197 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:29.197 "strip_size_kb": 0, 00:20:29.197 "state": "online", 00:20:29.197 "raid_level": "raid1", 00:20:29.197 "superblock": true, 00:20:29.197 "num_base_bdevs": 4, 00:20:29.197 "num_base_bdevs_discovered": 2, 00:20:29.197 "num_base_bdevs_operational": 2, 00:20:29.197 "base_bdevs_list": [ 00:20:29.197 { 00:20:29.197 "name": null, 00:20:29.197 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.197 "is_configured": false, 00:20:29.197 "data_offset": 2048, 00:20:29.197 "data_size": 63488 00:20:29.197 }, 
00:20:29.197 { 00:20:29.197 "name": null, 00:20:29.197 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.197 "is_configured": false, 00:20:29.197 "data_offset": 2048, 00:20:29.197 "data_size": 63488 00:20:29.197 }, 00:20:29.197 { 00:20:29.197 "name": "BaseBdev3", 00:20:29.197 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:29.197 "is_configured": true, 00:20:29.197 "data_offset": 2048, 00:20:29.197 "data_size": 63488 00:20:29.197 }, 00:20:29.197 { 00:20:29.197 "name": "BaseBdev4", 00:20:29.197 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:29.197 "is_configured": true, 00:20:29.197 "data_offset": 2048, 00:20:29.197 "data_size": 63488 00:20:29.197 } 00:20:29.197 ] 00:20:29.197 }' 00:20:29.197 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.197 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:29.454 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:29.454 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:29.454 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:29.455 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:29.455 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:29.455 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.455 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:29.713 "name": "raid_bdev1", 00:20:29.713 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:29.713 
"strip_size_kb": 0, 00:20:29.713 "state": "online", 00:20:29.713 "raid_level": "raid1", 00:20:29.713 "superblock": true, 00:20:29.713 "num_base_bdevs": 4, 00:20:29.713 "num_base_bdevs_discovered": 2, 00:20:29.713 "num_base_bdevs_operational": 2, 00:20:29.713 "base_bdevs_list": [ 00:20:29.713 { 00:20:29.713 "name": null, 00:20:29.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.713 "is_configured": false, 00:20:29.713 "data_offset": 2048, 00:20:29.713 "data_size": 63488 00:20:29.713 }, 00:20:29.713 { 00:20:29.713 "name": null, 00:20:29.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.713 "is_configured": false, 00:20:29.713 "data_offset": 2048, 00:20:29.713 "data_size": 63488 00:20:29.713 }, 00:20:29.713 { 00:20:29.713 "name": "BaseBdev3", 00:20:29.713 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:29.713 "is_configured": true, 00:20:29.713 "data_offset": 2048, 00:20:29.713 "data_size": 63488 00:20:29.713 }, 00:20:29.713 { 00:20:29.713 "name": "BaseBdev4", 00:20:29.713 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:29.713 "is_configured": true, 00:20:29.713 "data_offset": 2048, 00:20:29.713 "data_size": 63488 00:20:29.713 } 00:20:29.713 ] 00:20:29.713 }' 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:20:29.713 12:01:19 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:29.713 12:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:29.971 [2024-07-12 12:01:20.062114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:29.971 [2024-07-12 12:01:20.062220] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:29.971 [2024-07-12 12:01:20.062230] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:29.971 request: 00:20:29.971 { 00:20:29.971 "raid_bdev": "raid_bdev1", 00:20:29.971 "base_bdev": "BaseBdev1", 00:20:29.971 "method": "bdev_raid_add_base_bdev", 00:20:29.971 "req_id": 1 00:20:29.971 } 00:20:29.971 Got JSON-RPC error response 00:20:29.971 response: 00:20:29.971 { 00:20:29.971 "code": -22, 00:20:29.971 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:29.971 } 00:20:29.971 12:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:20:29.971 12:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:29.971 12:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:29.971 12:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:29.971 12:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.904 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.162 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.162 "name": "raid_bdev1", 00:20:31.162 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:31.162 "strip_size_kb": 0, 00:20:31.162 "state": "online", 00:20:31.162 "raid_level": "raid1", 00:20:31.162 "superblock": true, 00:20:31.162 "num_base_bdevs": 4, 00:20:31.162 "num_base_bdevs_discovered": 2, 00:20:31.162 "num_base_bdevs_operational": 2, 00:20:31.162 "base_bdevs_list": [ 00:20:31.162 { 00:20:31.162 "name": null, 00:20:31.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.163 "is_configured": false, 00:20:31.163 "data_offset": 2048, 00:20:31.163 "data_size": 63488 00:20:31.163 }, 00:20:31.163 { 00:20:31.163 "name": null, 00:20:31.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.163 "is_configured": false, 00:20:31.163 "data_offset": 2048, 00:20:31.163 "data_size": 63488 00:20:31.163 }, 00:20:31.163 { 00:20:31.163 "name": "BaseBdev3", 00:20:31.163 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 00:20:31.163 "is_configured": true, 00:20:31.163 "data_offset": 2048, 00:20:31.163 "data_size": 63488 00:20:31.163 }, 00:20:31.163 { 00:20:31.163 "name": "BaseBdev4", 00:20:31.163 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:31.163 "is_configured": true, 00:20:31.163 "data_offset": 2048, 00:20:31.163 "data_size": 63488 00:20:31.163 } 00:20:31.163 ] 00:20:31.163 }' 00:20:31.163 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.163 12:01:21 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:31.729 "name": "raid_bdev1", 00:20:31.729 "uuid": "cf01f1ab-21d8-4c42-b072-e1ee5625a77d", 00:20:31.729 "strip_size_kb": 0, 00:20:31.729 "state": "online", 00:20:31.729 "raid_level": "raid1", 00:20:31.729 "superblock": true, 00:20:31.729 "num_base_bdevs": 4, 00:20:31.729 "num_base_bdevs_discovered": 2, 00:20:31.729 "num_base_bdevs_operational": 2, 00:20:31.729 "base_bdevs_list": [ 00:20:31.729 { 00:20:31.729 "name": null, 00:20:31.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.729 "is_configured": false, 00:20:31.729 "data_offset": 2048, 00:20:31.729 "data_size": 63488 00:20:31.729 }, 00:20:31.729 { 00:20:31.729 "name": null, 00:20:31.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.729 "is_configured": false, 00:20:31.729 "data_offset": 2048, 00:20:31.729 "data_size": 63488 00:20:31.729 }, 00:20:31.729 { 00:20:31.729 "name": "BaseBdev3", 00:20:31.729 "uuid": "32990f8f-99eb-53d0-abcd-524f14def49a", 
00:20:31.729 "is_configured": true, 00:20:31.729 "data_offset": 2048, 00:20:31.729 "data_size": 63488 00:20:31.729 }, 00:20:31.729 { 00:20:31.729 "name": "BaseBdev4", 00:20:31.729 "uuid": "5f06044b-827b-5789-b6c2-71a11c73bad5", 00:20:31.729 "is_configured": true, 00:20:31.729 "data_offset": 2048, 00:20:31.729 "data_size": 63488 00:20:31.729 } 00:20:31.729 ] 00:20:31.729 }' 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:31.729 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:31.988 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:31.988 12:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 711998 00:20:31.988 12:01:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 711998 ']' 00:20:31.988 12:01:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 711998 00:20:31.988 12:01:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:31.988 12:01:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:31.988 12:01:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 711998 00:20:31.988 12:01:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:31.988 12:01:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:31.988 12:01:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 711998' 00:20:31.988 killing process with pid 711998 00:20:31.988 12:01:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 711998 00:20:31.988 Received 
shutdown signal, test time was about 60.000000 seconds 00:20:31.988 00:20:31.988 Latency(us) 00:20:31.988 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:31.988 =================================================================================================================== 00:20:31.988 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:31.988 [2024-07-12 12:01:22.037362] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:31.988 [2024-07-12 12:01:22.037427] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:31.988 [2024-07-12 12:01:22.037467] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:31.988 [2024-07-12 12:01:22.037473] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x270a4b0 name raid_bdev1, state offline 00:20:31.988 12:01:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 711998 00:20:31.988 [2024-07-12 12:01:22.076470] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:20:32.247 00:20:32.247 real 0m30.330s 00:20:32.247 user 0m44.010s 00:20:32.247 sys 0m4.168s 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:32.247 ************************************ 00:20:32.247 END TEST raid_rebuild_test_sb 00:20:32.247 ************************************ 00:20:32.247 12:01:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:32.247 12:01:22 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:20:32.247 12:01:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:32.247 12:01:22 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:20:32.247 12:01:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:32.247 ************************************ 00:20:32.247 START TEST raid_rebuild_test_io 00:20:32.247 ************************************ 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:32.247 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=717452 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 717452 /var/tmp/spdk-raid.sock 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 
717452 ']' 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:32.248 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:32.248 12:01:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:32.248 [2024-07-12 12:01:22.374898] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:20:32.248 [2024-07-12 12:01:22.374934] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid717452 ] 00:20:32.248 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:32.248 Zero copy mechanism will not be used. 
00:20:32.248 [2024-07-12 12:01:22.437395] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:32.505 [2024-07-12 12:01:22.516448] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:20:32.505 [2024-07-12 12:01:22.568212] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:32.505 [2024-07-12 12:01:22.568239] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:33.071 12:01:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:33.071 12:01:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:20:33.071 12:01:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:33.071 12:01:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:33.330 BaseBdev1_malloc 00:20:33.330 12:01:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:33.330 [2024-07-12 12:01:23.491896] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:33.330 [2024-07-12 12:01:23.491929] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:33.330 [2024-07-12 12:01:23.491942] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1717010 00:20:33.330 [2024-07-12 12:01:23.491948] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:33.330 [2024-07-12 12:01:23.493095] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:33.330 [2024-07-12 12:01:23.493115] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:33.330 BaseBdev1 
00:20:33.330 12:01:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:33.330 12:01:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:33.588 BaseBdev2_malloc 00:20:33.588 12:01:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:33.849 [2024-07-12 12:01:23.836491] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:33.849 [2024-07-12 12:01:23.836526] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:33.849 [2024-07-12 12:01:23.836540] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1717b60 00:20:33.849 [2024-07-12 12:01:23.836546] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:33.849 [2024-07-12 12:01:23.837614] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:33.849 [2024-07-12 12:01:23.837635] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:33.849 BaseBdev2 00:20:33.849 12:01:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:33.849 12:01:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:33.849 BaseBdev3_malloc 00:20:33.849 12:01:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:34.106 [2024-07-12 12:01:24.177048] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:34.106 [2024-07-12 12:01:24.177079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.106 [2024-07-12 12:01:24.177090] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18c40a0 00:20:34.106 [2024-07-12 12:01:24.177097] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.106 [2024-07-12 12:01:24.178161] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.106 [2024-07-12 12:01:24.178181] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:34.106 BaseBdev3 00:20:34.106 12:01:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:34.106 12:01:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:34.364 BaseBdev4_malloc 00:20:34.364 12:01:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:34.364 [2024-07-12 12:01:24.517501] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:34.364 [2024-07-12 12:01:24.517546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.364 [2024-07-12 12:01:24.517556] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18c2880 00:20:34.364 [2024-07-12 12:01:24.517578] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.364 [2024-07-12 12:01:24.518601] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.364 [2024-07-12 12:01:24.518621] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:20:34.364 BaseBdev4 00:20:34.364 12:01:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:34.621 spare_malloc 00:20:34.621 12:01:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:34.621 spare_delay 00:20:34.621 12:01:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:34.879 [2024-07-12 12:01:25.006133] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:34.879 [2024-07-12 12:01:25.006164] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.879 [2024-07-12 12:01:25.006177] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18c6400 00:20:34.879 [2024-07-12 12:01:25.006184] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.879 [2024-07-12 12:01:25.007255] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.879 [2024-07-12 12:01:25.007274] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:34.879 spare 00:20:34.879 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:35.138 [2024-07-12 12:01:25.166570] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:35.138 [2024-07-12 12:01:25.167460] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:20:35.138 [2024-07-12 12:01:25.167496] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:35.138 [2024-07-12 12:01:25.167537] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:35.138 [2024-07-12 12:01:25.167586] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18c0dd0 00:20:35.138 [2024-07-12 12:01:25.167590] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:35.138 [2024-07-12 12:01:25.167730] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x184b890 00:20:35.138 [2024-07-12 12:01:25.167828] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18c0dd0 00:20:35.138 [2024-07-12 12:01:25.167833] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18c0dd0 00:20:35.138 [2024-07-12 12:01:25.167903] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:35.138 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:35.138 "name": "raid_bdev1", 00:20:35.138 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:35.138 "strip_size_kb": 0, 00:20:35.138 "state": "online", 00:20:35.138 "raid_level": "raid1", 00:20:35.138 "superblock": false, 00:20:35.138 "num_base_bdevs": 4, 00:20:35.138 "num_base_bdevs_discovered": 4, 00:20:35.138 "num_base_bdevs_operational": 4, 00:20:35.138 "base_bdevs_list": [ 00:20:35.138 { 00:20:35.138 "name": "BaseBdev1", 00:20:35.138 "uuid": "66f61935-582f-5e18-b693-ed15d5626895", 00:20:35.138 "is_configured": true, 00:20:35.138 "data_offset": 0, 00:20:35.138 "data_size": 65536 00:20:35.138 }, 00:20:35.138 { 00:20:35.138 "name": "BaseBdev2", 00:20:35.138 "uuid": "f6d536ff-bc3d-508e-b28e-e6d2ad45d201", 00:20:35.138 "is_configured": true, 00:20:35.138 "data_offset": 0, 00:20:35.138 "data_size": 65536 00:20:35.138 }, 00:20:35.138 { 00:20:35.138 "name": "BaseBdev3", 00:20:35.138 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:35.138 "is_configured": true, 00:20:35.138 "data_offset": 0, 00:20:35.138 "data_size": 65536 00:20:35.138 }, 00:20:35.138 { 00:20:35.138 "name": "BaseBdev4", 00:20:35.138 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:35.138 "is_configured": true, 00:20:35.138 "data_offset": 0, 00:20:35.138 "data_size": 65536 00:20:35.138 } 00:20:35.138 ] 00:20:35.138 }' 00:20:35.139 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:20:35.139 12:01:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:35.707 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:35.707 12:01:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:35.965 [2024-07-12 12:01:26.004905] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:35.965 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:20:35.965 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.965 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:35.965 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:20:35.965 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:20:35.966 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:35.966 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:36.222 [2024-07-12 12:01:26.279202] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18c1940 00:20:36.222 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:36.222 Zero copy mechanism will not be used. 00:20:36.222 Running I/O for 60 seconds... 
00:20:36.222 [2024-07-12 12:01:26.347964] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:36.222 [2024-07-12 12:01:26.353153] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x18c1940 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.222 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:36.491 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.491 "name": "raid_bdev1", 00:20:36.491 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:36.491 "strip_size_kb": 0, 00:20:36.491 "state": "online", 00:20:36.491 "raid_level": "raid1", 00:20:36.491 "superblock": false, 
00:20:36.491 "num_base_bdevs": 4, 00:20:36.491 "num_base_bdevs_discovered": 3, 00:20:36.491 "num_base_bdevs_operational": 3, 00:20:36.491 "base_bdevs_list": [ 00:20:36.491 { 00:20:36.491 "name": null, 00:20:36.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.491 "is_configured": false, 00:20:36.491 "data_offset": 0, 00:20:36.491 "data_size": 65536 00:20:36.491 }, 00:20:36.491 { 00:20:36.491 "name": "BaseBdev2", 00:20:36.491 "uuid": "f6d536ff-bc3d-508e-b28e-e6d2ad45d201", 00:20:36.491 "is_configured": true, 00:20:36.491 "data_offset": 0, 00:20:36.491 "data_size": 65536 00:20:36.491 }, 00:20:36.491 { 00:20:36.491 "name": "BaseBdev3", 00:20:36.491 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:36.491 "is_configured": true, 00:20:36.491 "data_offset": 0, 00:20:36.491 "data_size": 65536 00:20:36.491 }, 00:20:36.491 { 00:20:36.491 "name": "BaseBdev4", 00:20:36.491 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:36.491 "is_configured": true, 00:20:36.491 "data_offset": 0, 00:20:36.491 "data_size": 65536 00:20:36.491 } 00:20:36.491 ] 00:20:36.491 }' 00:20:36.491 12:01:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.491 12:01:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:37.076 12:01:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:37.076 [2024-07-12 12:01:27.232998] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:37.076 12:01:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:37.076 [2024-07-12 12:01:27.284330] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17ae290 00:20:37.076 [2024-07-12 12:01:27.286054] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:37.334 [2024-07-12 
12:01:27.401157] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:37.334 [2024-07-12 12:01:27.402223] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:37.593 [2024-07-12 12:01:27.609681] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:37.593 [2024-07-12 12:01:27.609791] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:37.852 [2024-07-12 12:01:27.935507] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:37.852 [2024-07-12 12:01:27.936563] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:38.111 [2024-07-12 12:01:28.164443] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:38.111 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:38.111 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:38.111 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:38.111 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:38.111 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:38.111 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:38.111 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.369 
[2024-07-12 12:01:28.386311] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:38.369 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:38.369 "name": "raid_bdev1", 00:20:38.369 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:38.369 "strip_size_kb": 0, 00:20:38.369 "state": "online", 00:20:38.369 "raid_level": "raid1", 00:20:38.369 "superblock": false, 00:20:38.369 "num_base_bdevs": 4, 00:20:38.369 "num_base_bdevs_discovered": 4, 00:20:38.369 "num_base_bdevs_operational": 4, 00:20:38.369 "process": { 00:20:38.369 "type": "rebuild", 00:20:38.369 "target": "spare", 00:20:38.369 "progress": { 00:20:38.369 "blocks": 14336, 00:20:38.369 "percent": 21 00:20:38.369 } 00:20:38.369 }, 00:20:38.369 "base_bdevs_list": [ 00:20:38.369 { 00:20:38.369 "name": "spare", 00:20:38.369 "uuid": "dff5513c-94e9-5429-8289-6692ea1e7ae2", 00:20:38.369 "is_configured": true, 00:20:38.369 "data_offset": 0, 00:20:38.369 "data_size": 65536 00:20:38.369 }, 00:20:38.369 { 00:20:38.369 "name": "BaseBdev2", 00:20:38.369 "uuid": "f6d536ff-bc3d-508e-b28e-e6d2ad45d201", 00:20:38.369 "is_configured": true, 00:20:38.369 "data_offset": 0, 00:20:38.369 "data_size": 65536 00:20:38.369 }, 00:20:38.369 { 00:20:38.369 "name": "BaseBdev3", 00:20:38.369 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:38.369 "is_configured": true, 00:20:38.369 "data_offset": 0, 00:20:38.369 "data_size": 65536 00:20:38.369 }, 00:20:38.369 { 00:20:38.369 "name": "BaseBdev4", 00:20:38.369 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:38.369 "is_configured": true, 00:20:38.369 "data_offset": 0, 00:20:38.369 "data_size": 65536 00:20:38.369 } 00:20:38.369 ] 00:20:38.369 }' 00:20:38.369 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:38.369 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d 
]] 00:20:38.369 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:38.369 [2024-07-12 12:01:28.501080] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:38.369 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:38.369 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:38.628 [2024-07-12 12:01:28.691654] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:38.628 [2024-07-12 12:01:28.832112] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:38.628 [2024-07-12 12:01:28.840988] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:38.628 [2024-07-12 12:01:28.841011] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:38.628 [2024-07-12 12:01:28.841016] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:38.628 [2024-07-12 12:01:28.856798] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x18c1940 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.886 12:01:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:38.886 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:38.886 "name": "raid_bdev1", 00:20:38.886 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:38.886 "strip_size_kb": 0, 00:20:38.886 "state": "online", 00:20:38.886 "raid_level": "raid1", 00:20:38.886 "superblock": false, 00:20:38.886 "num_base_bdevs": 4, 00:20:38.886 "num_base_bdevs_discovered": 3, 00:20:38.886 "num_base_bdevs_operational": 3, 00:20:38.886 "base_bdevs_list": [ 00:20:38.886 { 00:20:38.886 "name": null, 00:20:38.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:38.886 "is_configured": false, 00:20:38.886 "data_offset": 0, 00:20:38.886 "data_size": 65536 00:20:38.886 }, 00:20:38.886 { 00:20:38.886 "name": "BaseBdev2", 00:20:38.886 "uuid": "f6d536ff-bc3d-508e-b28e-e6d2ad45d201", 00:20:38.886 "is_configured": true, 00:20:38.886 "data_offset": 0, 00:20:38.886 "data_size": 65536 00:20:38.886 }, 00:20:38.886 { 00:20:38.886 "name": "BaseBdev3", 00:20:38.886 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:38.886 "is_configured": true, 00:20:38.886 "data_offset": 0, 00:20:38.886 "data_size": 65536 00:20:38.886 }, 00:20:38.886 { 00:20:38.886 "name": 
"BaseBdev4", 00:20:38.886 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:38.886 "is_configured": true, 00:20:38.886 "data_offset": 0, 00:20:38.886 "data_size": 65536 00:20:38.886 } 00:20:38.886 ] 00:20:38.886 }' 00:20:38.886 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:38.886 12:01:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:39.453 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:39.453 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:39.453 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:39.453 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:39.453 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:39.453 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.453 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.712 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:39.712 "name": "raid_bdev1", 00:20:39.712 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:39.712 "strip_size_kb": 0, 00:20:39.712 "state": "online", 00:20:39.712 "raid_level": "raid1", 00:20:39.712 "superblock": false, 00:20:39.712 "num_base_bdevs": 4, 00:20:39.712 "num_base_bdevs_discovered": 3, 00:20:39.712 "num_base_bdevs_operational": 3, 00:20:39.712 "base_bdevs_list": [ 00:20:39.712 { 00:20:39.712 "name": null, 00:20:39.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.712 "is_configured": false, 00:20:39.712 "data_offset": 0, 00:20:39.712 "data_size": 65536 
00:20:39.712 }, 00:20:39.712 { 00:20:39.712 "name": "BaseBdev2", 00:20:39.712 "uuid": "f6d536ff-bc3d-508e-b28e-e6d2ad45d201", 00:20:39.712 "is_configured": true, 00:20:39.712 "data_offset": 0, 00:20:39.712 "data_size": 65536 00:20:39.712 }, 00:20:39.712 { 00:20:39.712 "name": "BaseBdev3", 00:20:39.712 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:39.712 "is_configured": true, 00:20:39.712 "data_offset": 0, 00:20:39.712 "data_size": 65536 00:20:39.712 }, 00:20:39.712 { 00:20:39.712 "name": "BaseBdev4", 00:20:39.712 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:39.712 "is_configured": true, 00:20:39.712 "data_offset": 0, 00:20:39.712 "data_size": 65536 00:20:39.712 } 00:20:39.712 ] 00:20:39.712 }' 00:20:39.712 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:39.712 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:39.712 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:39.712 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:39.712 12:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:39.971 [2024-07-12 12:01:29.990102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:39.971 12:01:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:39.971 [2024-07-12 12:01:30.037001] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18bcfd0 00:20:39.971 [2024-07-12 12:01:30.038093] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:39.971 [2024-07-12 12:01:30.165267] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 
offset_end: 6144 00:20:40.229 [2024-07-12 12:01:30.379693] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:40.229 [2024-07-12 12:01:30.379849] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:40.488 [2024-07-12 12:01:30.714309] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:40.745 [2024-07-12 12:01:30.841617] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:41.042 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:41.042 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:41.042 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:41.042 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:41.042 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:41.042 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.042 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:41.042 [2024-07-12 12:01:31.079228] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:41.042 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:41.042 "name": "raid_bdev1", 00:20:41.042 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:41.042 "strip_size_kb": 0, 00:20:41.042 "state": "online", 00:20:41.042 "raid_level": 
"raid1", 00:20:41.042 "superblock": false, 00:20:41.042 "num_base_bdevs": 4, 00:20:41.042 "num_base_bdevs_discovered": 4, 00:20:41.042 "num_base_bdevs_operational": 4, 00:20:41.042 "process": { 00:20:41.042 "type": "rebuild", 00:20:41.042 "target": "spare", 00:20:41.042 "progress": { 00:20:41.042 "blocks": 14336, 00:20:41.042 "percent": 21 00:20:41.042 } 00:20:41.042 }, 00:20:41.042 "base_bdevs_list": [ 00:20:41.042 { 00:20:41.042 "name": "spare", 00:20:41.042 "uuid": "dff5513c-94e9-5429-8289-6692ea1e7ae2", 00:20:41.042 "is_configured": true, 00:20:41.042 "data_offset": 0, 00:20:41.042 "data_size": 65536 00:20:41.042 }, 00:20:41.042 { 00:20:41.042 "name": "BaseBdev2", 00:20:41.042 "uuid": "f6d536ff-bc3d-508e-b28e-e6d2ad45d201", 00:20:41.042 "is_configured": true, 00:20:41.042 "data_offset": 0, 00:20:41.042 "data_size": 65536 00:20:41.042 }, 00:20:41.042 { 00:20:41.042 "name": "BaseBdev3", 00:20:41.042 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:41.042 "is_configured": true, 00:20:41.042 "data_offset": 0, 00:20:41.042 "data_size": 65536 00:20:41.042 }, 00:20:41.042 { 00:20:41.042 "name": "BaseBdev4", 00:20:41.042 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:41.042 "is_configured": true, 00:20:41.042 "data_offset": 0, 00:20:41.042 "data_size": 65536 00:20:41.042 } 00:20:41.042 ] 00:20:41.042 }' 00:20:41.042 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:41.304 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:41.304 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:41.304 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:41.304 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:20:41.304 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local 
num_base_bdevs_operational=4 00:20:41.304 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:41.304 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:41.304 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:41.304 [2024-07-12 12:01:31.452309] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:41.304 [2024-07-12 12:01:31.531137] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:41.562 [2024-07-12 12:01:31.643715] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x18c1940 00:20:41.562 [2024-07-12 12:01:31.643735] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x18bcfd0 00:20:41.562 [2024-07-12 12:01:31.644744] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:41.563 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:41.563 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:41.563 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:41.563 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:41.563 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:41.563 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:41.563 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:41.563 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.563 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:41.821 "name": "raid_bdev1", 00:20:41.821 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:41.821 "strip_size_kb": 0, 00:20:41.821 "state": "online", 00:20:41.821 "raid_level": "raid1", 00:20:41.821 "superblock": false, 00:20:41.821 "num_base_bdevs": 4, 00:20:41.821 "num_base_bdevs_discovered": 3, 00:20:41.821 "num_base_bdevs_operational": 3, 00:20:41.821 "process": { 00:20:41.821 "type": "rebuild", 00:20:41.821 "target": "spare", 00:20:41.821 "progress": { 00:20:41.821 "blocks": 22528, 00:20:41.821 "percent": 34 00:20:41.821 } 00:20:41.821 }, 00:20:41.821 "base_bdevs_list": [ 00:20:41.821 { 00:20:41.821 "name": "spare", 00:20:41.821 "uuid": "dff5513c-94e9-5429-8289-6692ea1e7ae2", 00:20:41.821 "is_configured": true, 00:20:41.821 "data_offset": 0, 00:20:41.821 "data_size": 65536 00:20:41.821 }, 00:20:41.821 { 00:20:41.821 "name": null, 00:20:41.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.821 "is_configured": false, 00:20:41.821 "data_offset": 0, 00:20:41.821 "data_size": 65536 00:20:41.821 }, 00:20:41.821 { 00:20:41.821 "name": "BaseBdev3", 00:20:41.821 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:41.821 "is_configured": true, 00:20:41.821 "data_offset": 0, 00:20:41.821 "data_size": 65536 00:20:41.821 }, 00:20:41.821 { 00:20:41.821 "name": "BaseBdev4", 00:20:41.821 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:41.821 "is_configured": true, 00:20:41.821 "data_offset": 0, 00:20:41.821 "data_size": 65536 00:20:41.821 } 00:20:41.821 ] 00:20:41.821 }' 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:41.821 12:01:31 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=711 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.821 12:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.079 12:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:42.079 "name": "raid_bdev1", 00:20:42.079 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:42.079 "strip_size_kb": 0, 00:20:42.079 "state": "online", 00:20:42.079 "raid_level": "raid1", 00:20:42.079 "superblock": false, 00:20:42.079 "num_base_bdevs": 4, 00:20:42.079 "num_base_bdevs_discovered": 3, 00:20:42.079 "num_base_bdevs_operational": 3, 00:20:42.079 "process": { 00:20:42.079 "type": "rebuild", 00:20:42.079 "target": "spare", 00:20:42.079 "progress": { 
00:20:42.079 "blocks": 26624, 00:20:42.079 "percent": 40 00:20:42.079 } 00:20:42.079 }, 00:20:42.079 "base_bdevs_list": [ 00:20:42.079 { 00:20:42.079 "name": "spare", 00:20:42.079 "uuid": "dff5513c-94e9-5429-8289-6692ea1e7ae2", 00:20:42.079 "is_configured": true, 00:20:42.079 "data_offset": 0, 00:20:42.079 "data_size": 65536 00:20:42.079 }, 00:20:42.079 { 00:20:42.079 "name": null, 00:20:42.079 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:42.079 "is_configured": false, 00:20:42.079 "data_offset": 0, 00:20:42.079 "data_size": 65536 00:20:42.079 }, 00:20:42.079 { 00:20:42.079 "name": "BaseBdev3", 00:20:42.079 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:42.079 "is_configured": true, 00:20:42.079 "data_offset": 0, 00:20:42.079 "data_size": 65536 00:20:42.079 }, 00:20:42.079 { 00:20:42.079 "name": "BaseBdev4", 00:20:42.079 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:42.079 "is_configured": true, 00:20:42.079 "data_offset": 0, 00:20:42.079 "data_size": 65536 00:20:42.079 } 00:20:42.079 ] 00:20:42.079 }' 00:20:42.079 12:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:42.079 [2024-07-12 12:01:32.112816] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:42.080 [2024-07-12 12:01:32.113187] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:42.080 12:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:42.080 12:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:42.080 12:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:42.080 12:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:42.646 [2024-07-12 12:01:32.761204] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:20:42.646 [2024-07-12 12:01:32.761473] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:20:42.904 [2024-07-12 12:01:32.976802] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:20:43.163 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:43.163 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:43.163 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:43.163 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:43.163 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:43.163 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:43.163 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.163 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.163 [2024-07-12 12:01:33.318690] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:20:43.163 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:43.163 "name": "raid_bdev1", 00:20:43.164 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:43.164 "strip_size_kb": 0, 00:20:43.164 "state": "online", 00:20:43.164 "raid_level": "raid1", 00:20:43.164 "superblock": false, 00:20:43.164 "num_base_bdevs": 4, 00:20:43.164 
"num_base_bdevs_discovered": 3, 00:20:43.164 "num_base_bdevs_operational": 3, 00:20:43.164 "process": { 00:20:43.164 "type": "rebuild", 00:20:43.164 "target": "spare", 00:20:43.164 "progress": { 00:20:43.164 "blocks": 45056, 00:20:43.164 "percent": 68 00:20:43.164 } 00:20:43.164 }, 00:20:43.164 "base_bdevs_list": [ 00:20:43.164 { 00:20:43.164 "name": "spare", 00:20:43.164 "uuid": "dff5513c-94e9-5429-8289-6692ea1e7ae2", 00:20:43.164 "is_configured": true, 00:20:43.164 "data_offset": 0, 00:20:43.164 "data_size": 65536 00:20:43.164 }, 00:20:43.164 { 00:20:43.164 "name": null, 00:20:43.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.164 "is_configured": false, 00:20:43.164 "data_offset": 0, 00:20:43.164 "data_size": 65536 00:20:43.164 }, 00:20:43.164 { 00:20:43.164 "name": "BaseBdev3", 00:20:43.164 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:43.164 "is_configured": true, 00:20:43.164 "data_offset": 0, 00:20:43.164 "data_size": 65536 00:20:43.164 }, 00:20:43.164 { 00:20:43.164 "name": "BaseBdev4", 00:20:43.164 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:43.164 "is_configured": true, 00:20:43.164 "data_offset": 0, 00:20:43.164 "data_size": 65536 00:20:43.164 } 00:20:43.164 ] 00:20:43.164 }' 00:20:43.164 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:43.164 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:43.164 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:43.422 [2024-07-12 12:01:33.426068] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:20:43.422 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:43.422 12:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:43.989 [2024-07-12 12:01:33.972321] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:20:43.989 [2024-07-12 12:01:34.179326] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:20:44.249 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:44.249 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:44.249 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:44.249 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:44.249 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:44.249 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:44.249 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.249 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.507 [2024-07-12 12:01:34.620903] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:44.507 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:44.507 "name": "raid_bdev1", 00:20:44.507 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:44.507 "strip_size_kb": 0, 00:20:44.507 "state": "online", 00:20:44.507 "raid_level": "raid1", 00:20:44.507 "superblock": false, 00:20:44.507 "num_base_bdevs": 4, 00:20:44.507 "num_base_bdevs_discovered": 3, 00:20:44.507 "num_base_bdevs_operational": 3, 00:20:44.507 "process": { 00:20:44.507 "type": "rebuild", 00:20:44.507 "target": "spare", 00:20:44.507 "progress": { 00:20:44.507 
"blocks": 63488, 00:20:44.507 "percent": 96 00:20:44.507 } 00:20:44.507 }, 00:20:44.507 "base_bdevs_list": [ 00:20:44.507 { 00:20:44.507 "name": "spare", 00:20:44.507 "uuid": "dff5513c-94e9-5429-8289-6692ea1e7ae2", 00:20:44.507 "is_configured": true, 00:20:44.507 "data_offset": 0, 00:20:44.507 "data_size": 65536 00:20:44.507 }, 00:20:44.507 { 00:20:44.507 "name": null, 00:20:44.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.507 "is_configured": false, 00:20:44.507 "data_offset": 0, 00:20:44.507 "data_size": 65536 00:20:44.507 }, 00:20:44.507 { 00:20:44.507 "name": "BaseBdev3", 00:20:44.507 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:44.507 "is_configured": true, 00:20:44.507 "data_offset": 0, 00:20:44.507 "data_size": 65536 00:20:44.507 }, 00:20:44.507 { 00:20:44.507 "name": "BaseBdev4", 00:20:44.507 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:44.507 "is_configured": true, 00:20:44.507 "data_offset": 0, 00:20:44.507 "data_size": 65536 00:20:44.507 } 00:20:44.507 ] 00:20:44.507 }' 00:20:44.507 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:44.507 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:44.507 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:44.507 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:44.507 12:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:44.507 [2024-07-12 12:01:34.721204] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:44.507 [2024-07-12 12:01:34.728888] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:45.882 "name": "raid_bdev1", 00:20:45.882 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:45.882 "strip_size_kb": 0, 00:20:45.882 "state": "online", 00:20:45.882 "raid_level": "raid1", 00:20:45.882 "superblock": false, 00:20:45.882 "num_base_bdevs": 4, 00:20:45.882 "num_base_bdevs_discovered": 3, 00:20:45.882 "num_base_bdevs_operational": 3, 00:20:45.882 "base_bdevs_list": [ 00:20:45.882 { 00:20:45.882 "name": "spare", 00:20:45.882 "uuid": "dff5513c-94e9-5429-8289-6692ea1e7ae2", 00:20:45.882 "is_configured": true, 00:20:45.882 "data_offset": 0, 00:20:45.882 "data_size": 65536 00:20:45.882 }, 00:20:45.882 { 00:20:45.882 "name": null, 00:20:45.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.882 "is_configured": false, 00:20:45.882 "data_offset": 0, 00:20:45.882 "data_size": 65536 00:20:45.882 }, 00:20:45.882 { 00:20:45.882 "name": "BaseBdev3", 00:20:45.882 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:45.882 "is_configured": true, 00:20:45.882 "data_offset": 0, 00:20:45.882 "data_size": 65536 00:20:45.882 }, 
00:20:45.882 { 00:20:45.882 "name": "BaseBdev4", 00:20:45.882 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:45.882 "is_configured": true, 00:20:45.882 "data_offset": 0, 00:20:45.882 "data_size": 65536 00:20:45.882 } 00:20:45.882 ] 00:20:45.882 }' 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.882 12:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.882 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:45.882 "name": "raid_bdev1", 00:20:45.882 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:45.882 "strip_size_kb": 0, 00:20:45.882 "state": "online", 00:20:45.882 "raid_level": "raid1", 00:20:45.882 
"superblock": false, 00:20:45.882 "num_base_bdevs": 4, 00:20:45.882 "num_base_bdevs_discovered": 3, 00:20:45.882 "num_base_bdevs_operational": 3, 00:20:45.882 "base_bdevs_list": [ 00:20:45.882 { 00:20:45.882 "name": "spare", 00:20:45.882 "uuid": "dff5513c-94e9-5429-8289-6692ea1e7ae2", 00:20:45.882 "is_configured": true, 00:20:45.882 "data_offset": 0, 00:20:45.882 "data_size": 65536 00:20:45.882 }, 00:20:45.882 { 00:20:45.882 "name": null, 00:20:45.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.882 "is_configured": false, 00:20:45.882 "data_offset": 0, 00:20:45.882 "data_size": 65536 00:20:45.882 }, 00:20:45.882 { 00:20:45.882 "name": "BaseBdev3", 00:20:45.882 "uuid": "28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:45.882 "is_configured": true, 00:20:45.882 "data_offset": 0, 00:20:45.882 "data_size": 65536 00:20:45.882 }, 00:20:45.882 { 00:20:45.882 "name": "BaseBdev4", 00:20:45.882 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:45.882 "is_configured": true, 00:20:45.882 "data_offset": 0, 00:20:45.882 "data_size": 65536 00:20:45.882 } 00:20:45.882 ] 00:20:45.882 }' 00:20:45.882 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:46.141 "name": "raid_bdev1", 00:20:46.141 "uuid": "6c9ecc31-6333-44a6-8e1f-c5f12094e642", 00:20:46.141 "strip_size_kb": 0, 00:20:46.141 "state": "online", 00:20:46.141 "raid_level": "raid1", 00:20:46.141 "superblock": false, 00:20:46.141 "num_base_bdevs": 4, 00:20:46.141 "num_base_bdevs_discovered": 3, 00:20:46.141 "num_base_bdevs_operational": 3, 00:20:46.141 "base_bdevs_list": [ 00:20:46.141 { 00:20:46.141 "name": "spare", 00:20:46.141 "uuid": "dff5513c-94e9-5429-8289-6692ea1e7ae2", 00:20:46.141 "is_configured": true, 00:20:46.141 "data_offset": 0, 00:20:46.141 "data_size": 65536 00:20:46.141 }, 00:20:46.141 { 00:20:46.141 "name": null, 00:20:46.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.141 "is_configured": false, 00:20:46.141 "data_offset": 0, 00:20:46.141 "data_size": 65536 00:20:46.141 }, 00:20:46.141 { 00:20:46.141 "name": "BaseBdev3", 00:20:46.141 "uuid": 
"28b9d108-b2e5-56f5-bee4-bda609efdd3b", 00:20:46.141 "is_configured": true, 00:20:46.141 "data_offset": 0, 00:20:46.141 "data_size": 65536 00:20:46.141 }, 00:20:46.141 { 00:20:46.141 "name": "BaseBdev4", 00:20:46.141 "uuid": "b8962bf3-cb26-522b-b69a-efb77c3670ad", 00:20:46.141 "is_configured": true, 00:20:46.141 "data_offset": 0, 00:20:46.141 "data_size": 65536 00:20:46.141 } 00:20:46.141 ] 00:20:46.141 }' 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:46.141 12:01:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:46.706 12:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:46.965 [2024-07-12 12:01:37.009322] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:46.965 [2024-07-12 12:01:37.009344] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:46.965 00:20:46.965 Latency(us) 00:20:46.965 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:46.965 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:20:46.965 raid_bdev1 : 10.78 105.36 316.07 0.00 0.00 13669.86 236.98 114344.72 00:20:46.965 =================================================================================================================== 00:20:46.965 Total : 105.36 316.07 0.00 0.00 13669.86 236.98 114344.72 00:20:46.965 [2024-07-12 12:01:37.092246] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:46.965 [2024-07-12 12:01:37.092283] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:46.965 [2024-07-12 12:01:37.092345] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:46.965 [2024-07-12 12:01:37.092351] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18c0dd0 name raid_bdev1, state offline 00:20:46.965 0 00:20:46.965 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.965 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:20:47.223 /dev/nbd0 00:20:47.223 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 
00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:47.482 1+0 records in 00:20:47.482 1+0 records out 00:20:47.482 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218273 s, 18.8 MB/s 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:47.482 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:20:47.483 /dev/nbd1 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:47.483 1+0 records in 00:20:47.483 1+0 records out 00:20:47.483 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225119 s, 18.2 MB/s 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:47.483 12:01:37 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:47.483 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:47.742 12:01:37 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:47.742 12:01:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:20:48.001 /dev/nbd1 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:48.001 1+0 records in 00:20:48.001 1+0 records out 00:20:48.001 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000184209 s, 22.2 MB/s 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:48.001 12:01:38 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:48.001 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:48.259 12:01:38 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 717452 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 717452 ']' 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 717452 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 717452 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 717452' 00:20:48.518 killing process with pid 717452 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 717452 00:20:48.518 Received shutdown signal, test time was about 12.307350 seconds 00:20:48.518 00:20:48.518 Latency(us) 00:20:48.518 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:48.518 =================================================================================================================== 00:20:48.518 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:20:48.518 [2024-07-12 12:01:38.618227] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:48.518 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 717452 00:20:48.518 [2024-07-12 12:01:38.652247] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:20:48.777 00:20:48.777 real 0m16.516s 00:20:48.777 user 0m24.757s 00:20:48.777 sys 0m2.258s 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:48.777 ************************************ 00:20:48.777 END TEST raid_rebuild_test_io 00:20:48.777 ************************************ 00:20:48.777 12:01:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:48.777 12:01:38 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:20:48.777 12:01:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:48.777 12:01:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:48.777 12:01:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:48.777 ************************************ 00:20:48.777 START TEST 
raid_rebuild_test_sb_io 00:20:48.777 ************************************ 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 
00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=720360 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 720360 /var/tmp/spdk-raid.sock 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 720360 ']' 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local 
max_retries=100 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:48.777 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:48.777 12:01:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:48.777 [2024-07-12 12:01:38.955280] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:20:48.777 [2024-07-12 12:01:38.955319] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid720360 ] 00:20:48.777 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:48.777 Zero copy mechanism will not be used. 
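The `waitforlisten 720360 /var/tmp/spdk-raid.sock` step traced above polls until the freshly launched bdevperf process is up and listening on its RPC UNIX socket before any `rpc.py` call is issued, using the same bounded `(( i <= 20 ))` retry idiom as the `waitfornbd` loops earlier in this log. A minimal sketch of that polling pattern, with a hypothetical helper name and a plain path check standing in for the real helper's socket-plus-pid test:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the waitforlisten pattern: poll for a path (the RPC
# UNIX socket in the real trace) with a bounded retry count, returning success
# as soon as it appears and failure once the retries are exhausted.
wait_for_path() {
	local path=$1 max_retries=${2:-20} i
	for ((i = 1; i <= max_retries; i++)); do
		[ -e "$path" ] && return 0   # listener is up; rpc.py may connect
		sleep 0.1
	done
	return 1                             # gave up; caller should fail the test
}
```

The real SPDK helper additionally verifies the pid is still alive on each iteration so a crashed target fails fast instead of burning the full retry budget.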
00:20:48.777 [2024-07-12 12:01:39.018432] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:49.034 [2024-07-12 12:01:39.099194] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:20:49.035 [2024-07-12 12:01:39.155434] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:49.035 [2024-07-12 12:01:39.155462] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:49.612 12:01:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:49.612 12:01:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:20:49.612 12:01:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:49.612 12:01:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:49.873 BaseBdev1_malloc 00:20:49.873 12:01:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:49.873 [2024-07-12 12:01:40.030422] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:49.873 [2024-07-12 12:01:40.030456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:49.873 [2024-07-12 12:01:40.030470] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf7f010 00:20:49.873 [2024-07-12 12:01:40.030494] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:49.873 [2024-07-12 12:01:40.031680] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:49.873 [2024-07-12 12:01:40.031703] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:49.873 
BaseBdev1 00:20:49.873 12:01:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:49.873 12:01:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:50.139 BaseBdev2_malloc 00:20:50.139 12:01:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:50.139 [2024-07-12 12:01:40.383120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:50.139 [2024-07-12 12:01:40.383149] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:50.139 [2024-07-12 12:01:40.383161] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf7fb60 00:20:50.139 [2024-07-12 12:01:40.383167] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:50.139 [2024-07-12 12:01:40.384134] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:50.139 [2024-07-12 12:01:40.384155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:50.401 BaseBdev2 00:20:50.401 12:01:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:50.401 12:01:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:50.401 BaseBdev3_malloc 00:20:50.401 12:01:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:50.659 [2024-07-12 
12:01:40.731385] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:50.659 [2024-07-12 12:01:40.731419] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:50.659 [2024-07-12 12:01:40.731429] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x112c0a0 00:20:50.659 [2024-07-12 12:01:40.731435] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:50.659 [2024-07-12 12:01:40.732458] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:50.659 [2024-07-12 12:01:40.732478] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:50.659 BaseBdev3 00:20:50.659 12:01:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:50.659 12:01:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:50.659 BaseBdev4_malloc 00:20:50.917 12:01:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:50.917 [2024-07-12 12:01:41.059646] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:50.917 [2024-07-12 12:01:41.059675] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:50.917 [2024-07-12 12:01:41.059686] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x112a880 00:20:50.917 [2024-07-12 12:01:41.059692] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:50.917 [2024-07-12 12:01:41.060663] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:50.917 [2024-07-12 12:01:41.060682] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:50.917 BaseBdev4 00:20:50.917 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:51.176 spare_malloc 00:20:51.176 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:51.176 spare_delay 00:20:51.176 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:51.435 [2024-07-12 12:01:41.560389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:51.435 [2024-07-12 12:01:41.560420] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:51.435 [2024-07-12 12:01:41.560432] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x112e400 00:20:51.435 [2024-07-12 12:01:41.560438] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:51.435 [2024-07-12 12:01:41.561595] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:51.435 [2024-07-12 12:01:41.561615] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:51.435 spare 00:20:51.435 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:51.694 [2024-07-12 12:01:41.724891] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:51.694 [2024-07-12 
12:01:41.725791] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:51.694 [2024-07-12 12:01:41.725830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:51.694 [2024-07-12 12:01:41.725857] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:51.694 [2024-07-12 12:01:41.725977] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1128dd0 00:20:51.694 [2024-07-12 12:01:41.725983] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:51.694 [2024-07-12 12:01:41.726121] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf7e460 00:20:51.694 [2024-07-12 12:01:41.726224] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1128dd0 00:20:51.694 [2024-07-12 12:01:41.726230] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1128dd0 00:20:51.694 [2024-07-12 12:01:41.726290] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- 
# local num_base_bdevs 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:51.694 "name": "raid_bdev1", 00:20:51.694 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:20:51.694 "strip_size_kb": 0, 00:20:51.694 "state": "online", 00:20:51.694 "raid_level": "raid1", 00:20:51.694 "superblock": true, 00:20:51.694 "num_base_bdevs": 4, 00:20:51.694 "num_base_bdevs_discovered": 4, 00:20:51.694 "num_base_bdevs_operational": 4, 00:20:51.694 "base_bdevs_list": [ 00:20:51.694 { 00:20:51.694 "name": "BaseBdev1", 00:20:51.694 "uuid": "26cdf22f-3c37-54b2-9d6f-cc4ff3f07ba5", 00:20:51.694 "is_configured": true, 00:20:51.694 "data_offset": 2048, 00:20:51.694 "data_size": 63488 00:20:51.694 }, 00:20:51.694 { 00:20:51.694 "name": "BaseBdev2", 00:20:51.694 "uuid": "e346fc75-e650-59a1-aafd-d45576a691cc", 00:20:51.694 "is_configured": true, 00:20:51.694 "data_offset": 2048, 00:20:51.694 "data_size": 63488 00:20:51.694 }, 00:20:51.694 { 00:20:51.694 "name": "BaseBdev3", 00:20:51.694 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:20:51.694 "is_configured": true, 00:20:51.694 "data_offset": 2048, 00:20:51.694 "data_size": 63488 00:20:51.694 }, 00:20:51.694 { 00:20:51.694 "name": "BaseBdev4", 00:20:51.694 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:20:51.694 "is_configured": true, 00:20:51.694 "data_offset": 2048, 00:20:51.694 "data_size": 63488 00:20:51.694 } 00:20:51.694 ] 00:20:51.694 
}' 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:51.694 12:01:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:52.260 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:52.260 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:52.518 [2024-07-12 12:01:42.507094] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:52.518 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:20:52.518 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.518 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:52.518 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:20:52.518 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:20:52.518 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:52.518 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:52.776 [2024-07-12 12:01:42.773370] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf7e0d0 00:20:52.776 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:52.776 Zero copy mechanism will not be used. 00:20:52.776 Running I/O for 60 seconds... 
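Earlier in this trace, each rebuilt base bdev is exported over NBD and compared byte-for-byte against the raid device with `cmp -i 0 /dev/nbd0 /dev/nbd1`. A minimal, hedged sketch of that verification step, with ordinary files standing in for the `/dev/nbd*` devices and an illustrative wrapper name:

```shell
#!/usr/bin/env bash
# Sketch of the rebuild-verification step: cmp exits 0 only when the two
# inputs are byte-identical starting from the given offset (-i 0 here,
# matching the traced command). File paths stand in for /dev/nbd0 and
# /dev/nbd1, which require root and the nbd kernel module.
verify_rebuilt_copy() {
	local raid_dev=$1 base_dev=$2
	if cmp -i 0 "$raid_dev" "$base_dev"; then
		echo "match"
	else
		echo "MISMATCH between $raid_dev and $base_dev" >&2
		return 1
	fi
}
```

Because the script runs under `set -e`-style error trapping, a nonzero `cmp` exit aborts the test immediately, which is exactly how a corrupted rebuild would surface in this log.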
00:20:52.776 [2024-07-12 12:01:42.844564] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:52.776 [2024-07-12 12:01:42.849971] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf7e0d0 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.776 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.777 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.777 12:01:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.034 12:01:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.034 "name": "raid_bdev1", 00:20:53.034 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:20:53.034 "strip_size_kb": 0, 00:20:53.034 "state": "online", 00:20:53.034 "raid_level": "raid1", 
00:20:53.034 "superblock": true, 00:20:53.034 "num_base_bdevs": 4, 00:20:53.034 "num_base_bdevs_discovered": 3, 00:20:53.034 "num_base_bdevs_operational": 3, 00:20:53.034 "base_bdevs_list": [ 00:20:53.034 { 00:20:53.034 "name": null, 00:20:53.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.034 "is_configured": false, 00:20:53.034 "data_offset": 2048, 00:20:53.034 "data_size": 63488 00:20:53.034 }, 00:20:53.034 { 00:20:53.034 "name": "BaseBdev2", 00:20:53.034 "uuid": "e346fc75-e650-59a1-aafd-d45576a691cc", 00:20:53.034 "is_configured": true, 00:20:53.034 "data_offset": 2048, 00:20:53.034 "data_size": 63488 00:20:53.034 }, 00:20:53.034 { 00:20:53.034 "name": "BaseBdev3", 00:20:53.034 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:20:53.034 "is_configured": true, 00:20:53.034 "data_offset": 2048, 00:20:53.034 "data_size": 63488 00:20:53.034 }, 00:20:53.034 { 00:20:53.034 "name": "BaseBdev4", 00:20:53.034 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:20:53.034 "is_configured": true, 00:20:53.034 "data_offset": 2048, 00:20:53.034 "data_size": 63488 00:20:53.034 } 00:20:53.034 ] 00:20:53.034 }' 00:20:53.034 12:01:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.034 12:01:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:53.599 12:01:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:53.599 [2024-07-12 12:01:43.733849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:53.599 12:01:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:53.599 [2024-07-12 12:01:43.792234] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1016290 00:20:53.599 [2024-07-12 12:01:43.793673] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on 
raid bdev raid_bdev1 00:20:53.857 [2024-07-12 12:01:43.913924] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:53.857 [2024-07-12 12:01:43.914372] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:53.857 [2024-07-12 12:01:44.042417] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:53.858 [2024-07-12 12:01:44.042951] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:54.424 [2024-07-12 12:01:44.375209] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:54.424 [2024-07-12 12:01:44.375430] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:54.424 [2024-07-12 12:01:44.591824] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:54.424 [2024-07-12 12:01:44.592396] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:54.683 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:54.683 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:54.683 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:54.683 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:54.683 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:54.683 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.683 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:54.683 [2024-07-12 12:01:44.917152] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:54.942 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:54.942 "name": "raid_bdev1", 00:20:54.942 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:20:54.942 "strip_size_kb": 0, 00:20:54.942 "state": "online", 00:20:54.942 "raid_level": "raid1", 00:20:54.942 "superblock": true, 00:20:54.942 "num_base_bdevs": 4, 00:20:54.942 "num_base_bdevs_discovered": 4, 00:20:54.942 "num_base_bdevs_operational": 4, 00:20:54.942 "process": { 00:20:54.942 "type": "rebuild", 00:20:54.942 "target": "spare", 00:20:54.942 "progress": { 00:20:54.942 "blocks": 14336, 00:20:54.942 "percent": 22 00:20:54.942 } 00:20:54.942 }, 00:20:54.942 "base_bdevs_list": [ 00:20:54.942 { 00:20:54.942 "name": "spare", 00:20:54.942 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:20:54.942 "is_configured": true, 00:20:54.942 "data_offset": 2048, 00:20:54.942 "data_size": 63488 00:20:54.942 }, 00:20:54.942 { 00:20:54.942 "name": "BaseBdev2", 00:20:54.942 "uuid": "e346fc75-e650-59a1-aafd-d45576a691cc", 00:20:54.942 "is_configured": true, 00:20:54.942 "data_offset": 2048, 00:20:54.942 "data_size": 63488 00:20:54.942 }, 00:20:54.942 { 00:20:54.942 "name": "BaseBdev3", 00:20:54.942 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:20:54.942 "is_configured": true, 00:20:54.942 "data_offset": 2048, 00:20:54.942 "data_size": 63488 00:20:54.942 }, 00:20:54.942 { 00:20:54.942 "name": "BaseBdev4", 00:20:54.942 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:20:54.942 "is_configured": true, 00:20:54.942 "data_offset": 2048, 00:20:54.942 "data_size": 63488 
00:20:54.942 } 00:20:54.942 ] 00:20:54.942 }' 00:20:54.942 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:54.942 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:54.942 12:01:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:54.942 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:54.942 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:54.942 [2024-07-12 12:01:45.040409] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:55.201 [2024-07-12 12:01:45.195513] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:55.201 [2024-07-12 12:01:45.360683] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:55.201 [2024-07-12 12:01:45.369603] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:55.201 [2024-07-12 12:01:45.369628] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:55.201 [2024-07-12 12:01:45.369634] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:55.201 [2024-07-12 12:01:45.385823] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf7e0d0 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.201 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.460 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.460 "name": "raid_bdev1", 00:20:55.460 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:20:55.460 "strip_size_kb": 0, 00:20:55.460 "state": "online", 00:20:55.460 "raid_level": "raid1", 00:20:55.460 "superblock": true, 00:20:55.460 "num_base_bdevs": 4, 00:20:55.460 "num_base_bdevs_discovered": 3, 00:20:55.460 "num_base_bdevs_operational": 3, 00:20:55.460 "base_bdevs_list": [ 00:20:55.460 { 00:20:55.460 "name": null, 00:20:55.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.460 "is_configured": false, 00:20:55.460 "data_offset": 2048, 00:20:55.460 "data_size": 63488 00:20:55.460 }, 00:20:55.460 { 00:20:55.460 "name": "BaseBdev2", 00:20:55.460 "uuid": "e346fc75-e650-59a1-aafd-d45576a691cc", 00:20:55.460 "is_configured": true, 00:20:55.460 
"data_offset": 2048, 00:20:55.460 "data_size": 63488 00:20:55.460 }, 00:20:55.460 { 00:20:55.460 "name": "BaseBdev3", 00:20:55.460 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:20:55.460 "is_configured": true, 00:20:55.460 "data_offset": 2048, 00:20:55.460 "data_size": 63488 00:20:55.460 }, 00:20:55.460 { 00:20:55.460 "name": "BaseBdev4", 00:20:55.460 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:20:55.460 "is_configured": true, 00:20:55.460 "data_offset": 2048, 00:20:55.460 "data_size": 63488 00:20:55.460 } 00:20:55.460 ] 00:20:55.460 }' 00:20:55.460 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.460 12:01:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:56.026 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:56.026 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:56.026 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:56.026 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:56.026 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:56.026 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.026 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:56.026 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:56.026 "name": "raid_bdev1", 00:20:56.026 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:20:56.026 "strip_size_kb": 0, 00:20:56.026 "state": "online", 00:20:56.026 "raid_level": "raid1", 00:20:56.026 "superblock": true, 
00:20:56.026 "num_base_bdevs": 4, 00:20:56.026 "num_base_bdevs_discovered": 3, 00:20:56.026 "num_base_bdevs_operational": 3, 00:20:56.026 "base_bdevs_list": [ 00:20:56.026 { 00:20:56.026 "name": null, 00:20:56.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.026 "is_configured": false, 00:20:56.026 "data_offset": 2048, 00:20:56.026 "data_size": 63488 00:20:56.026 }, 00:20:56.026 { 00:20:56.026 "name": "BaseBdev2", 00:20:56.026 "uuid": "e346fc75-e650-59a1-aafd-d45576a691cc", 00:20:56.026 "is_configured": true, 00:20:56.026 "data_offset": 2048, 00:20:56.026 "data_size": 63488 00:20:56.026 }, 00:20:56.026 { 00:20:56.026 "name": "BaseBdev3", 00:20:56.026 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:20:56.026 "is_configured": true, 00:20:56.026 "data_offset": 2048, 00:20:56.026 "data_size": 63488 00:20:56.026 }, 00:20:56.026 { 00:20:56.026 "name": "BaseBdev4", 00:20:56.026 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:20:56.026 "is_configured": true, 00:20:56.026 "data_offset": 2048, 00:20:56.027 "data_size": 63488 00:20:56.027 } 00:20:56.027 ] 00:20:56.027 }' 00:20:56.027 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:56.285 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:56.285 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:56.285 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:56.285 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:56.285 [2024-07-12 12:01:46.490853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:56.544 [2024-07-12 12:01:46.535072] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 
0xc86230 00:20:56.544 12:01:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:56.544 [2024-07-12 12:01:46.536152] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:56.544 [2024-07-12 12:01:46.651532] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:56.544 [2024-07-12 12:01:46.652583] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:56.803 [2024-07-12 12:01:46.867526] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:56.803 [2024-07-12 12:01:46.867665] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:57.066 [2024-07-12 12:01:47.104671] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:57.066 [2024-07-12 12:01:47.105791] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:57.066 [2024-07-12 12:01:47.307811] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:57.066 [2024-07-12 12:01:47.307927] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:57.326 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:57.326 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:57.326 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:57.326 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 
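The verify helpers above repeatedly pipe the RPC output through `jq -r '.process.type // "none"'`. jq's `//` is the alternative operator: it yields the right-hand operand whenever the left-hand expression produces `null` or `false`, which is how a raid bdev with no active rebuild process maps to the literal string "none". A minimal illustration with hand-written JSON (the shape mirrors, but is not copied from, the `bdev_raid_get_bdevs` dumps in this log):

```shell
# During a rebuild, .process exists and .process.type is "rebuild",
# so the // fallback is never taken.
echo '{"name":"raid_bdev1","process":{"type":"rebuild","target":"spare"}}' \
  | jq -r '.process.type // "none"'     # prints: rebuild

# Once the rebuild finishes, .process is absent; .process.type then
# evaluates to null and // substitutes the fallback string.
echo '{"name":"raid_bdev1","state":"online"}' \
  | jq -r '.process.type // "none"'     # prints: none
```

This is why the test can use the same `[[ $type == \r\e\b\u\i\l\d ]]`-style comparison both during and after a rebuild without special-casing a missing field.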
00:20:57.326 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:57.326 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.326 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:57.615 [2024-07-12 12:01:47.679830] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:57.615 "name": "raid_bdev1", 00:20:57.615 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:20:57.615 "strip_size_kb": 0, 00:20:57.615 "state": "online", 00:20:57.615 "raid_level": "raid1", 00:20:57.615 "superblock": true, 00:20:57.615 "num_base_bdevs": 4, 00:20:57.615 "num_base_bdevs_discovered": 4, 00:20:57.615 "num_base_bdevs_operational": 4, 00:20:57.615 "process": { 00:20:57.615 "type": "rebuild", 00:20:57.615 "target": "spare", 00:20:57.615 "progress": { 00:20:57.615 "blocks": 16384, 00:20:57.615 "percent": 25 00:20:57.615 } 00:20:57.615 }, 00:20:57.615 "base_bdevs_list": [ 00:20:57.615 { 00:20:57.615 "name": "spare", 00:20:57.615 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:20:57.615 "is_configured": true, 00:20:57.615 "data_offset": 2048, 00:20:57.615 "data_size": 63488 00:20:57.615 }, 00:20:57.615 { 00:20:57.615 "name": "BaseBdev2", 00:20:57.615 "uuid": "e346fc75-e650-59a1-aafd-d45576a691cc", 00:20:57.615 "is_configured": true, 00:20:57.615 "data_offset": 2048, 00:20:57.615 "data_size": 63488 00:20:57.615 }, 00:20:57.615 { 00:20:57.615 "name": "BaseBdev3", 00:20:57.615 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:20:57.615 "is_configured": true, 00:20:57.615 "data_offset": 2048, 00:20:57.615 "data_size": 63488 00:20:57.615 }, 00:20:57.615 { 
00:20:57.615 "name": "BaseBdev4", 00:20:57.615 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:20:57.615 "is_configured": true, 00:20:57.615 "data_offset": 2048, 00:20:57.615 "data_size": 63488 00:20:57.615 } 00:20:57.615 ] 00:20:57.615 }' 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:20:57.615 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:57.615 12:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:57.912 [2024-07-12 12:01:47.954081] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:57.912 [2024-07-12 12:01:48.013131] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:58.170 [2024-07-12 12:01:48.213674] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xf7e0d0 
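Two bash behaviors visible in the trace above are worth noting. First, xtrace (`set -x`) prints the right-hand side of a `[[ ... == pattern ]]` comparison with every character backslash-escaped (`\r\e\b\u\i\l\d`, `\s\p\a\r\e`) to mark the pattern as literal rather than a glob; the log lines are normal, not corrupted. Second, the error `bdev_raid.sh: line 665: [: =: unary operator expected` is the classic symptom of an unquoted empty variable handed to the `[` builtin. A minimal standalone sketch of both (variable names here are illustrative, not taken from bdev_raid.sh):

```shell
#!/usr/bin/env bash

# 1) Under `bash -x`, this comparison is logged as
#    `[[ rebuild == \r\e\b\u\i\l\d ]]` -- each pattern character is
#    escaped in the trace to show it matches literally.
process_type="rebuild"
if [[ $process_type == "rebuild" ]]; then
    echo "process type matches"
fi

# 2) The unary-operator error: an empty, unquoted variable disappears
#    during word splitting, so `[` sees only `= false` and cannot parse
#    `=` as a unary operator. The test exits with status 2.
flag=""
if [ $flag = false ] 2>/dev/null; then   # expands to `[ = false ]`
    echo "never reached"
fi

# Conventional fixes: quote the expansion so `[` always receives three
# arguments, or use [[ ]], which does not word-split its operands.
if [ "$flag" = false ]; then echo "flag is false"; else echo "flag is empty"; fi
if [[ $flag == false ]]; then echo "flag is false"; else echo "flag is empty"; fi
```

The script tolerates the error here because the failing `[` simply returns a non-zero status and the `elif`/fallthrough path still runs, but quoting the variable at line 665 would silence the message.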
00:20:58.170 [2024-07-12 12:01:48.213695] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xc86230 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.170 [2024-07-12 12:01:48.335367] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:58.170 "name": "raid_bdev1", 00:20:58.170 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:20:58.170 "strip_size_kb": 0, 00:20:58.170 "state": "online", 00:20:58.170 "raid_level": "raid1", 00:20:58.170 "superblock": true, 00:20:58.170 "num_base_bdevs": 4, 00:20:58.170 "num_base_bdevs_discovered": 3, 00:20:58.170 "num_base_bdevs_operational": 3, 00:20:58.170 "process": { 00:20:58.170 "type": "rebuild", 00:20:58.170 "target": "spare", 00:20:58.170 "progress": 
{ 00:20:58.170 "blocks": 22528, 00:20:58.170 "percent": 35 00:20:58.170 } 00:20:58.170 }, 00:20:58.170 "base_bdevs_list": [ 00:20:58.170 { 00:20:58.170 "name": "spare", 00:20:58.170 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:20:58.170 "is_configured": true, 00:20:58.170 "data_offset": 2048, 00:20:58.170 "data_size": 63488 00:20:58.170 }, 00:20:58.170 { 00:20:58.170 "name": null, 00:20:58.170 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.170 "is_configured": false, 00:20:58.170 "data_offset": 2048, 00:20:58.170 "data_size": 63488 00:20:58.170 }, 00:20:58.170 { 00:20:58.170 "name": "BaseBdev3", 00:20:58.170 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:20:58.170 "is_configured": true, 00:20:58.170 "data_offset": 2048, 00:20:58.170 "data_size": 63488 00:20:58.170 }, 00:20:58.170 { 00:20:58.170 "name": "BaseBdev4", 00:20:58.170 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:20:58.170 "is_configured": true, 00:20:58.170 "data_offset": 2048, 00:20:58.170 "data_size": 63488 00:20:58.170 } 00:20:58.170 ] 00:20:58.170 }' 00:20:58.170 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=728 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.429 [2024-07-12 12:01:48.550888] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:20:58.429 [2024-07-12 12:01:48.672549] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:58.429 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:58.429 "name": "raid_bdev1", 00:20:58.429 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:20:58.429 "strip_size_kb": 0, 00:20:58.429 "state": "online", 00:20:58.429 "raid_level": "raid1", 00:20:58.429 "superblock": true, 00:20:58.429 "num_base_bdevs": 4, 00:20:58.429 "num_base_bdevs_discovered": 3, 00:20:58.429 "num_base_bdevs_operational": 3, 00:20:58.429 "process": { 00:20:58.429 "type": "rebuild", 00:20:58.429 "target": "spare", 00:20:58.429 "progress": { 00:20:58.429 "blocks": 26624, 00:20:58.429 "percent": 41 00:20:58.429 } 00:20:58.429 }, 00:20:58.429 "base_bdevs_list": [ 00:20:58.429 { 00:20:58.429 "name": "spare", 00:20:58.429 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:20:58.429 "is_configured": true, 00:20:58.429 "data_offset": 2048, 00:20:58.429 "data_size": 63488 00:20:58.429 }, 00:20:58.429 { 00:20:58.429 "name": null, 00:20:58.429 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:58.429 "is_configured": false, 00:20:58.429 "data_offset": 2048, 00:20:58.429 "data_size": 63488 00:20:58.429 }, 00:20:58.429 { 00:20:58.429 "name": "BaseBdev3", 00:20:58.430 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:20:58.430 "is_configured": true, 00:20:58.430 "data_offset": 2048, 00:20:58.430 "data_size": 63488 00:20:58.430 }, 00:20:58.430 { 00:20:58.430 "name": "BaseBdev4", 00:20:58.430 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:20:58.430 "is_configured": true, 00:20:58.430 "data_offset": 2048, 00:20:58.430 "data_size": 63488 00:20:58.430 } 00:20:58.430 ] 00:20:58.430 }' 00:20:58.430 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:58.688 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:58.688 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:58.688 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:58.688 12:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:58.947 [2024-07-12 12:01:49.103420] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:59.885 12:01:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.885 [2024-07-12 12:01:49.913314] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:59.885 "name": "raid_bdev1", 00:20:59.885 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:20:59.885 "strip_size_kb": 0, 00:20:59.885 "state": "online", 00:20:59.885 "raid_level": "raid1", 00:20:59.885 "superblock": true, 00:20:59.885 "num_base_bdevs": 4, 00:20:59.885 "num_base_bdevs_discovered": 3, 00:20:59.885 "num_base_bdevs_operational": 3, 00:20:59.885 "process": { 00:20:59.885 "type": "rebuild", 00:20:59.885 "target": "spare", 00:20:59.885 "progress": { 00:20:59.885 "blocks": 47104, 00:20:59.885 "percent": 74 00:20:59.885 } 00:20:59.885 }, 00:20:59.885 "base_bdevs_list": [ 00:20:59.885 { 00:20:59.885 "name": "spare", 00:20:59.885 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:20:59.885 "is_configured": true, 00:20:59.885 "data_offset": 2048, 00:20:59.885 "data_size": 63488 00:20:59.885 }, 00:20:59.885 { 00:20:59.885 "name": null, 00:20:59.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.885 "is_configured": false, 00:20:59.885 "data_offset": 2048, 00:20:59.885 "data_size": 63488 00:20:59.885 }, 00:20:59.885 { 00:20:59.885 "name": "BaseBdev3", 00:20:59.885 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:20:59.885 "is_configured": true, 00:20:59.885 "data_offset": 2048, 00:20:59.885 "data_size": 63488 00:20:59.885 }, 00:20:59.885 { 00:20:59.885 "name": "BaseBdev4", 
00:20:59.885 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:20:59.885 "is_configured": true, 00:20:59.885 "data_offset": 2048, 00:20:59.885 "data_size": 63488 00:20:59.885 } 00:20:59.885 ] 00:20:59.885 }' 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:59.885 12:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:59.885 12:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:59.885 12:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:00.452 [2024-07-12 12:01:50.564795] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:21:01.020 [2024-07-12 12:01:51.007018] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.020 12:01:51 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.020 [2024-07-12 12:01:51.112535] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:01.020 [2024-07-12 12:01:51.115686] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:01.020 "name": "raid_bdev1", 00:21:01.020 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:01.020 "strip_size_kb": 0, 00:21:01.020 "state": "online", 00:21:01.020 "raid_level": "raid1", 00:21:01.020 "superblock": true, 00:21:01.020 "num_base_bdevs": 4, 00:21:01.020 "num_base_bdevs_discovered": 3, 00:21:01.020 "num_base_bdevs_operational": 3, 00:21:01.020 "base_bdevs_list": [ 00:21:01.020 { 00:21:01.020 "name": "spare", 00:21:01.020 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:21:01.020 "is_configured": true, 00:21:01.020 "data_offset": 2048, 00:21:01.020 "data_size": 63488 00:21:01.020 }, 00:21:01.020 { 00:21:01.020 "name": null, 00:21:01.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.020 "is_configured": false, 00:21:01.020 "data_offset": 2048, 00:21:01.020 "data_size": 63488 00:21:01.020 }, 00:21:01.020 { 00:21:01.020 "name": "BaseBdev3", 00:21:01.020 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:01.020 "is_configured": true, 00:21:01.020 "data_offset": 2048, 00:21:01.020 "data_size": 63488 00:21:01.020 }, 00:21:01.020 { 00:21:01.020 "name": "BaseBdev4", 00:21:01.020 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:01.020 "is_configured": true, 00:21:01.020 "data_offset": 2048, 00:21:01.020 "data_size": 63488 00:21:01.020 } 00:21:01.020 ] 00:21:01.020 }' 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:21:01.020 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.278 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:01.278 "name": "raid_bdev1", 00:21:01.278 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:01.278 "strip_size_kb": 0, 00:21:01.278 "state": "online", 00:21:01.278 "raid_level": "raid1", 00:21:01.278 "superblock": true, 00:21:01.278 "num_base_bdevs": 4, 00:21:01.278 "num_base_bdevs_discovered": 3, 00:21:01.279 "num_base_bdevs_operational": 3, 00:21:01.279 "base_bdevs_list": [ 00:21:01.279 { 00:21:01.279 "name": "spare", 00:21:01.279 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:21:01.279 "is_configured": true, 00:21:01.279 "data_offset": 2048, 00:21:01.279 "data_size": 63488 00:21:01.279 }, 00:21:01.279 { 00:21:01.279 "name": 
null, 00:21:01.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.279 "is_configured": false, 00:21:01.279 "data_offset": 2048, 00:21:01.279 "data_size": 63488 00:21:01.279 }, 00:21:01.279 { 00:21:01.279 "name": "BaseBdev3", 00:21:01.279 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:01.279 "is_configured": true, 00:21:01.279 "data_offset": 2048, 00:21:01.279 "data_size": 63488 00:21:01.279 }, 00:21:01.279 { 00:21:01.279 "name": "BaseBdev4", 00:21:01.279 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:01.279 "is_configured": true, 00:21:01.279 "data_offset": 2048, 00:21:01.279 "data_size": 63488 00:21:01.279 } 00:21:01.279 ] 00:21:01.279 }' 00:21:01.279 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:01.279 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:01.279 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.538 "name": "raid_bdev1", 00:21:01.538 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:01.538 "strip_size_kb": 0, 00:21:01.538 "state": "online", 00:21:01.538 "raid_level": "raid1", 00:21:01.538 "superblock": true, 00:21:01.538 "num_base_bdevs": 4, 00:21:01.538 "num_base_bdevs_discovered": 3, 00:21:01.538 "num_base_bdevs_operational": 3, 00:21:01.538 "base_bdevs_list": [ 00:21:01.538 { 00:21:01.538 "name": "spare", 00:21:01.538 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:21:01.538 "is_configured": true, 00:21:01.538 "data_offset": 2048, 00:21:01.538 "data_size": 63488 00:21:01.538 }, 00:21:01.538 { 00:21:01.538 "name": null, 00:21:01.538 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.538 "is_configured": false, 00:21:01.538 "data_offset": 2048, 00:21:01.538 "data_size": 63488 00:21:01.538 }, 00:21:01.538 { 00:21:01.538 "name": "BaseBdev3", 00:21:01.538 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:01.538 "is_configured": true, 00:21:01.538 "data_offset": 2048, 00:21:01.538 "data_size": 63488 00:21:01.538 }, 00:21:01.538 { 00:21:01.538 "name": "BaseBdev4", 00:21:01.538 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:01.538 "is_configured": true, 00:21:01.538 "data_offset": 2048, 00:21:01.538 "data_size": 63488 00:21:01.538 } 00:21:01.538 
] 00:21:01.538 }' 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.538 12:01:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:02.106 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:02.365 [2024-07-12 12:01:52.361249] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:02.365 [2024-07-12 12:01:52.361272] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:02.365 00:21:02.365 Latency(us) 00:21:02.365 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:02.365 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:02.365 raid_bdev1 : 9.61 113.36 340.07 0.00 0.00 11899.78 236.98 120336.58 00:21:02.365 =================================================================================================================== 00:21:02.365 Total : 113.36 340.07 0.00 0.00 11899.78 236.98 120336.58 00:21:02.365 [2024-07-12 12:01:52.407986] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:02.365 [2024-07-12 12:01:52.408007] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:02.365 [2024-07-12 12:01:52.408068] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:02.365 [2024-07-12 12:01:52.408074] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1128dd0 name raid_bdev1, state offline 00:21:02.365 0 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@719 -- # jq length 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:02.365 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:02.624 /dev/nbd0 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:21:02.624 12:01:52 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:02.624 1+0 records in 00:21:02.624 1+0 records out 00:21:02.624 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000208578 s, 19.6 MB/s 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:02.624 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:21:02.883 /dev/nbd1 00:21:02.883 12:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # 
local i 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:02.883 1+0 records in 00:21:02.883 1+0 records out 00:21:02.883 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000213976 s, 19.1 MB/s 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:02.883 12:01:53 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:02.883 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # 
nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:03.142 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:21:03.401 /dev/nbd1 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 
00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:03.401 1+0 records in 00:21:03.401 1+0 records out 00:21:03.401 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225386 s, 18.2 MB/s 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:03.401 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:03.660 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:21:03.919 12:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:03.919 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:04.178 [2024-07-12 12:01:54.248301] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:04.178 [2024-07-12 12:01:54.248337] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:04.178 [2024-07-12 12:01:54.248350] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10b5eb0 00:21:04.178 [2024-07-12 12:01:54.248356] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:04.178 [2024-07-12 12:01:54.249616] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:04.178 [2024-07-12 12:01:54.249637] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:04.178 [2024-07-12 12:01:54.249699] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:04.178 [2024-07-12 12:01:54.249721] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:04.178 [2024-07-12 12:01:54.249796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:04.178 [2024-07-12 12:01:54.249844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:04.178 spare 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.178 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:04.178 [2024-07-12 12:01:54.350143] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10b5a00 00:21:04.178 [2024-07-12 12:01:54.350154] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:04.178 [2024-07-12 12:01:54.350283] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc86230 00:21:04.178 [2024-07-12 12:01:54.350385] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10b5a00 00:21:04.178 [2024-07-12 12:01:54.350390] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10b5a00 00:21:04.178 [2024-07-12 12:01:54.350458] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:04.436 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.436 "name": "raid_bdev1", 00:21:04.436 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:04.436 "strip_size_kb": 0, 00:21:04.436 "state": "online", 00:21:04.436 "raid_level": "raid1", 00:21:04.436 "superblock": true, 00:21:04.436 "num_base_bdevs": 4, 00:21:04.436 "num_base_bdevs_discovered": 3, 00:21:04.436 "num_base_bdevs_operational": 3, 00:21:04.436 "base_bdevs_list": [ 00:21:04.436 { 00:21:04.436 "name": "spare", 00:21:04.436 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:21:04.436 "is_configured": true, 00:21:04.436 "data_offset": 2048, 00:21:04.436 "data_size": 63488 00:21:04.436 }, 00:21:04.436 { 00:21:04.436 "name": null, 00:21:04.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.436 "is_configured": false, 00:21:04.436 "data_offset": 2048, 00:21:04.436 "data_size": 63488 00:21:04.436 }, 00:21:04.436 { 00:21:04.436 "name": "BaseBdev3", 00:21:04.436 
"uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:04.436 "is_configured": true, 00:21:04.436 "data_offset": 2048, 00:21:04.436 "data_size": 63488 00:21:04.436 }, 00:21:04.436 { 00:21:04.436 "name": "BaseBdev4", 00:21:04.436 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:04.436 "is_configured": true, 00:21:04.436 "data_offset": 2048, 00:21:04.436 "data_size": 63488 00:21:04.436 } 00:21:04.436 ] 00:21:04.436 }' 00:21:04.436 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.436 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:05.003 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:05.003 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:05.003 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:05.003 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:05.003 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:05.003 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.003 12:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:05.003 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:05.003 "name": "raid_bdev1", 00:21:05.003 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:05.003 "strip_size_kb": 0, 00:21:05.003 "state": "online", 00:21:05.003 "raid_level": "raid1", 00:21:05.003 "superblock": true, 00:21:05.003 "num_base_bdevs": 4, 00:21:05.003 "num_base_bdevs_discovered": 3, 00:21:05.003 "num_base_bdevs_operational": 3, 00:21:05.003 
"base_bdevs_list": [ 00:21:05.003 { 00:21:05.003 "name": "spare", 00:21:05.003 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:21:05.003 "is_configured": true, 00:21:05.003 "data_offset": 2048, 00:21:05.003 "data_size": 63488 00:21:05.003 }, 00:21:05.003 { 00:21:05.003 "name": null, 00:21:05.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:05.003 "is_configured": false, 00:21:05.003 "data_offset": 2048, 00:21:05.003 "data_size": 63488 00:21:05.003 }, 00:21:05.003 { 00:21:05.003 "name": "BaseBdev3", 00:21:05.003 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:05.003 "is_configured": true, 00:21:05.003 "data_offset": 2048, 00:21:05.003 "data_size": 63488 00:21:05.003 }, 00:21:05.003 { 00:21:05.003 "name": "BaseBdev4", 00:21:05.003 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:05.003 "is_configured": true, 00:21:05.003 "data_offset": 2048, 00:21:05.003 "data_size": 63488 00:21:05.003 } 00:21:05.003 ] 00:21:05.003 }' 00:21:05.003 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:05.003 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:05.003 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:05.003 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:05.003 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.003 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:05.262 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:21:05.262 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:05.520 [2024-07-12 12:01:55.519751] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:05.520 "name": "raid_bdev1", 00:21:05.520 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:05.520 "strip_size_kb": 0, 00:21:05.520 "state": "online", 00:21:05.520 "raid_level": "raid1", 00:21:05.520 "superblock": true, 00:21:05.520 "num_base_bdevs": 4, 
00:21:05.520 "num_base_bdevs_discovered": 2, 00:21:05.520 "num_base_bdevs_operational": 2, 00:21:05.520 "base_bdevs_list": [ 00:21:05.520 { 00:21:05.520 "name": null, 00:21:05.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:05.520 "is_configured": false, 00:21:05.520 "data_offset": 2048, 00:21:05.520 "data_size": 63488 00:21:05.520 }, 00:21:05.520 { 00:21:05.520 "name": null, 00:21:05.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:05.520 "is_configured": false, 00:21:05.520 "data_offset": 2048, 00:21:05.520 "data_size": 63488 00:21:05.520 }, 00:21:05.520 { 00:21:05.520 "name": "BaseBdev3", 00:21:05.520 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:05.520 "is_configured": true, 00:21:05.520 "data_offset": 2048, 00:21:05.520 "data_size": 63488 00:21:05.520 }, 00:21:05.520 { 00:21:05.520 "name": "BaseBdev4", 00:21:05.520 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:05.520 "is_configured": true, 00:21:05.520 "data_offset": 2048, 00:21:05.520 "data_size": 63488 00:21:05.520 } 00:21:05.520 ] 00:21:05.520 }' 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:05.520 12:01:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:06.087 12:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:06.087 [2024-07-12 12:01:56.325930] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:06.087 [2024-07-12 12:01:56.326040] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:06.087 [2024-07-12 12:01:56.326050] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:21:06.087 [2024-07-12 12:01:56.326066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:06.087 [2024-07-12 12:01:56.329923] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10b14e0 00:21:06.087 [2024-07-12 12:01:56.331488] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:06.346 12:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:21:07.289 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:07.289 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:07.289 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:07.289 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:07.289 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:07.289 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.289 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.289 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:07.289 "name": "raid_bdev1", 00:21:07.289 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:07.289 "strip_size_kb": 0, 00:21:07.289 "state": "online", 00:21:07.289 "raid_level": "raid1", 00:21:07.289 "superblock": true, 00:21:07.289 "num_base_bdevs": 4, 00:21:07.289 "num_base_bdevs_discovered": 3, 00:21:07.289 "num_base_bdevs_operational": 3, 00:21:07.289 "process": { 00:21:07.289 "type": "rebuild", 00:21:07.289 "target": "spare", 00:21:07.289 "progress": { 00:21:07.289 "blocks": 22528, 
00:21:07.289 "percent": 35 00:21:07.289 } 00:21:07.289 }, 00:21:07.289 "base_bdevs_list": [ 00:21:07.289 { 00:21:07.289 "name": "spare", 00:21:07.289 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:21:07.289 "is_configured": true, 00:21:07.289 "data_offset": 2048, 00:21:07.289 "data_size": 63488 00:21:07.289 }, 00:21:07.289 { 00:21:07.289 "name": null, 00:21:07.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.289 "is_configured": false, 00:21:07.289 "data_offset": 2048, 00:21:07.289 "data_size": 63488 00:21:07.289 }, 00:21:07.289 { 00:21:07.289 "name": "BaseBdev3", 00:21:07.289 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:07.289 "is_configured": true, 00:21:07.289 "data_offset": 2048, 00:21:07.289 "data_size": 63488 00:21:07.289 }, 00:21:07.289 { 00:21:07.289 "name": "BaseBdev4", 00:21:07.289 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:07.289 "is_configured": true, 00:21:07.290 "data_offset": 2048, 00:21:07.290 "data_size": 63488 00:21:07.290 } 00:21:07.290 ] 00:21:07.290 }' 00:21:07.290 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:07.550 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:07.550 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:07.550 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:07.550 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:07.550 [2024-07-12 12:01:57.754592] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:07.808 [2024-07-12 12:01:57.842042] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:07.808 [2024-07-12 12:01:57.842070] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:07.808 [2024-07-12 12:01:57.842079] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:07.808 [2024-07-12 12:01:57.842083] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.808 12:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.808 12:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:07.808 "name": "raid_bdev1", 00:21:07.808 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 
00:21:07.808 "strip_size_kb": 0, 00:21:07.808 "state": "online", 00:21:07.808 "raid_level": "raid1", 00:21:07.808 "superblock": true, 00:21:07.808 "num_base_bdevs": 4, 00:21:07.808 "num_base_bdevs_discovered": 2, 00:21:07.808 "num_base_bdevs_operational": 2, 00:21:07.808 "base_bdevs_list": [ 00:21:07.808 { 00:21:07.808 "name": null, 00:21:07.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.808 "is_configured": false, 00:21:07.808 "data_offset": 2048, 00:21:07.808 "data_size": 63488 00:21:07.808 }, 00:21:07.808 { 00:21:07.808 "name": null, 00:21:07.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.808 "is_configured": false, 00:21:07.808 "data_offset": 2048, 00:21:07.808 "data_size": 63488 00:21:07.808 }, 00:21:07.808 { 00:21:07.808 "name": "BaseBdev3", 00:21:07.808 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:07.808 "is_configured": true, 00:21:07.808 "data_offset": 2048, 00:21:07.808 "data_size": 63488 00:21:07.808 }, 00:21:07.808 { 00:21:07.808 "name": "BaseBdev4", 00:21:07.808 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:07.808 "is_configured": true, 00:21:07.808 "data_offset": 2048, 00:21:07.808 "data_size": 63488 00:21:07.808 } 00:21:07.808 ] 00:21:07.808 }' 00:21:07.808 12:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:07.808 12:01:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:08.375 12:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:08.635 [2024-07-12 12:01:58.668035] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:08.635 [2024-07-12 12:01:58.668069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:08.635 [2024-07-12 12:01:58.668080] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x10b2940 00:21:08.635 [2024-07-12 12:01:58.668102] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:08.635 [2024-07-12 12:01:58.668370] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:08.635 [2024-07-12 12:01:58.668380] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:08.635 [2024-07-12 12:01:58.668437] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:08.635 [2024-07-12 12:01:58.668444] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:08.635 [2024-07-12 12:01:58.668449] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:08.635 [2024-07-12 12:01:58.668460] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:08.635 [2024-07-12 12:01:58.672346] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10b14e0 00:21:08.635 spare 00:21:08.635 [2024-07-12 12:01:58.673410] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:08.635 12:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:21:09.570 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:09.570 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:09.570 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:09.570 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:09.570 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:09.570 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.570 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:09.828 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:09.829 "name": "raid_bdev1", 00:21:09.829 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:09.829 "strip_size_kb": 0, 00:21:09.829 "state": "online", 00:21:09.829 "raid_level": "raid1", 00:21:09.829 "superblock": true, 00:21:09.829 "num_base_bdevs": 4, 00:21:09.829 "num_base_bdevs_discovered": 3, 00:21:09.829 "num_base_bdevs_operational": 3, 00:21:09.829 "process": { 00:21:09.829 "type": "rebuild", 00:21:09.829 "target": "spare", 00:21:09.829 "progress": { 00:21:09.829 "blocks": 22528, 00:21:09.829 "percent": 35 00:21:09.829 } 00:21:09.829 }, 00:21:09.829 "base_bdevs_list": [ 00:21:09.829 { 00:21:09.829 "name": "spare", 00:21:09.829 "uuid": "985df082-3da1-52b1-9f63-f170378eb6fe", 00:21:09.829 "is_configured": true, 00:21:09.829 "data_offset": 2048, 00:21:09.829 "data_size": 63488 00:21:09.829 }, 00:21:09.829 { 00:21:09.829 "name": null, 00:21:09.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.829 "is_configured": false, 00:21:09.829 "data_offset": 2048, 00:21:09.829 "data_size": 63488 00:21:09.829 }, 00:21:09.829 { 00:21:09.829 "name": "BaseBdev3", 00:21:09.829 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:09.829 "is_configured": true, 00:21:09.829 "data_offset": 2048, 00:21:09.829 "data_size": 63488 00:21:09.829 }, 00:21:09.829 { 00:21:09.829 "name": "BaseBdev4", 00:21:09.829 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:09.829 "is_configured": true, 00:21:09.829 "data_offset": 2048, 00:21:09.829 "data_size": 63488 00:21:09.829 } 00:21:09.829 ] 00:21:09.829 }' 00:21:09.829 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:21:09.829 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:09.829 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:09.829 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:09.829 12:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:10.087 [2024-07-12 12:02:00.107003] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:10.087 [2024-07-12 12:02:00.184009] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:10.087 [2024-07-12 12:02:00.184043] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:10.087 [2024-07-12 12:02:00.184052] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:10.087 [2024-07-12 12:02:00.184056] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.087 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:10.345 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.345 "name": "raid_bdev1", 00:21:10.345 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:10.345 "strip_size_kb": 0, 00:21:10.345 "state": "online", 00:21:10.345 "raid_level": "raid1", 00:21:10.345 "superblock": true, 00:21:10.345 "num_base_bdevs": 4, 00:21:10.345 "num_base_bdevs_discovered": 2, 00:21:10.345 "num_base_bdevs_operational": 2, 00:21:10.345 "base_bdevs_list": [ 00:21:10.345 { 00:21:10.345 "name": null, 00:21:10.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.345 "is_configured": false, 00:21:10.345 "data_offset": 2048, 00:21:10.345 "data_size": 63488 00:21:10.345 }, 00:21:10.345 { 00:21:10.345 "name": null, 00:21:10.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.345 "is_configured": false, 00:21:10.345 "data_offset": 2048, 00:21:10.345 "data_size": 63488 00:21:10.345 }, 00:21:10.345 { 00:21:10.345 "name": "BaseBdev3", 00:21:10.345 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:10.345 "is_configured": true, 00:21:10.345 "data_offset": 2048, 00:21:10.345 "data_size": 63488 00:21:10.345 }, 00:21:10.345 { 00:21:10.345 "name": "BaseBdev4", 00:21:10.345 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:10.345 "is_configured": true, 00:21:10.345 "data_offset": 2048, 
00:21:10.345 "data_size": 63488 00:21:10.345 } 00:21:10.345 ] 00:21:10.345 }' 00:21:10.345 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.345 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:10.912 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:10.912 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:10.912 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:10.912 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:10.912 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:10.912 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:10.912 12:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.912 12:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:10.912 "name": "raid_bdev1", 00:21:10.912 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:10.912 "strip_size_kb": 0, 00:21:10.912 "state": "online", 00:21:10.912 "raid_level": "raid1", 00:21:10.912 "superblock": true, 00:21:10.912 "num_base_bdevs": 4, 00:21:10.912 "num_base_bdevs_discovered": 2, 00:21:10.912 "num_base_bdevs_operational": 2, 00:21:10.912 "base_bdevs_list": [ 00:21:10.912 { 00:21:10.912 "name": null, 00:21:10.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.912 "is_configured": false, 00:21:10.912 "data_offset": 2048, 00:21:10.912 "data_size": 63488 00:21:10.912 }, 00:21:10.912 { 00:21:10.912 "name": null, 00:21:10.912 "uuid": "00000000-0000-0000-0000-000000000000", 
00:21:10.912 "is_configured": false, 00:21:10.912 "data_offset": 2048, 00:21:10.912 "data_size": 63488 00:21:10.912 }, 00:21:10.912 { 00:21:10.912 "name": "BaseBdev3", 00:21:10.912 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:10.912 "is_configured": true, 00:21:10.912 "data_offset": 2048, 00:21:10.912 "data_size": 63488 00:21:10.912 }, 00:21:10.912 { 00:21:10.912 "name": "BaseBdev4", 00:21:10.912 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:10.912 "is_configured": true, 00:21:10.912 "data_offset": 2048, 00:21:10.912 "data_size": 63488 00:21:10.912 } 00:21:10.912 ] 00:21:10.912 }' 00:21:10.912 12:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:10.912 12:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:10.912 12:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:10.912 12:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:10.912 12:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:11.169 12:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:11.427 [2024-07-12 12:02:01.427161] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:11.427 [2024-07-12 12:02:01.427192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.427 [2024-07-12 12:02:01.427204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10b4ad0 00:21:11.427 [2024-07-12 12:02:01.427210] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.427 
[2024-07-12 12:02:01.427459] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.427 [2024-07-12 12:02:01.427470] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:11.427 [2024-07-12 12:02:01.427513] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:11.427 [2024-07-12 12:02:01.427528] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:21:11.427 [2024-07-12 12:02:01.427533] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:11.427 BaseBdev1 00:21:11.427 12:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.363 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.622 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.622 "name": "raid_bdev1", 00:21:12.622 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:12.622 "strip_size_kb": 0, 00:21:12.622 "state": "online", 00:21:12.622 "raid_level": "raid1", 00:21:12.622 "superblock": true, 00:21:12.622 "num_base_bdevs": 4, 00:21:12.622 "num_base_bdevs_discovered": 2, 00:21:12.622 "num_base_bdevs_operational": 2, 00:21:12.622 "base_bdevs_list": [ 00:21:12.622 { 00:21:12.622 "name": null, 00:21:12.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.622 "is_configured": false, 00:21:12.622 "data_offset": 2048, 00:21:12.622 "data_size": 63488 00:21:12.622 }, 00:21:12.622 { 00:21:12.622 "name": null, 00:21:12.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.622 "is_configured": false, 00:21:12.622 "data_offset": 2048, 00:21:12.622 "data_size": 63488 00:21:12.622 }, 00:21:12.622 { 00:21:12.622 "name": "BaseBdev3", 00:21:12.622 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:12.622 "is_configured": true, 00:21:12.622 "data_offset": 2048, 00:21:12.622 "data_size": 63488 00:21:12.622 }, 00:21:12.622 { 00:21:12.622 "name": "BaseBdev4", 00:21:12.622 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:12.622 "is_configured": true, 00:21:12.622 "data_offset": 2048, 00:21:12.622 "data_size": 63488 00:21:12.622 } 00:21:12.622 ] 00:21:12.622 }' 00:21:12.622 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.622 12:02:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:12.891 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:21:12.891 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:12.891 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:12.891 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:12.891 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:12.891 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.891 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:13.150 "name": "raid_bdev1", 00:21:13.150 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:13.150 "strip_size_kb": 0, 00:21:13.150 "state": "online", 00:21:13.150 "raid_level": "raid1", 00:21:13.150 "superblock": true, 00:21:13.150 "num_base_bdevs": 4, 00:21:13.150 "num_base_bdevs_discovered": 2, 00:21:13.150 "num_base_bdevs_operational": 2, 00:21:13.150 "base_bdevs_list": [ 00:21:13.150 { 00:21:13.150 "name": null, 00:21:13.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:13.150 "is_configured": false, 00:21:13.150 "data_offset": 2048, 00:21:13.150 "data_size": 63488 00:21:13.150 }, 00:21:13.150 { 00:21:13.150 "name": null, 00:21:13.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:13.150 "is_configured": false, 00:21:13.150 "data_offset": 2048, 00:21:13.150 "data_size": 63488 00:21:13.150 }, 00:21:13.150 { 00:21:13.150 "name": "BaseBdev3", 00:21:13.150 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:13.150 "is_configured": true, 00:21:13.150 "data_offset": 2048, 00:21:13.150 "data_size": 63488 00:21:13.150 }, 00:21:13.150 { 
00:21:13.150 "name": "BaseBdev4", 00:21:13.150 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:13.150 "is_configured": true, 00:21:13.150 "data_offset": 2048, 00:21:13.150 "data_size": 63488 00:21:13.150 } 00:21:13.150 ] 00:21:13.150 }' 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:13.150 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:13.408 [2024-07-12 12:02:03.532792] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:13.408 [2024-07-12 12:02:03.532884] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:21:13.408 [2024-07-12 12:02:03.532892] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:13.408 request: 00:21:13.408 { 00:21:13.408 "raid_bdev": "raid_bdev1", 00:21:13.408 "base_bdev": "BaseBdev1", 00:21:13.408 "method": "bdev_raid_add_base_bdev", 00:21:13.408 "req_id": 1 00:21:13.408 } 00:21:13.408 Got JSON-RPC error response 00:21:13.408 response: 00:21:13.408 { 00:21:13.408 "code": -22, 00:21:13.408 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:13.408 } 00:21:13.408 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:21:13.408 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:13.408 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:13.408 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:13.408 12:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.341 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:14.600 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.600 "name": "raid_bdev1", 00:21:14.600 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:14.600 "strip_size_kb": 0, 00:21:14.600 "state": "online", 00:21:14.600 "raid_level": "raid1", 00:21:14.600 "superblock": true, 00:21:14.600 "num_base_bdevs": 4, 00:21:14.600 
"num_base_bdevs_discovered": 2, 00:21:14.600 "num_base_bdevs_operational": 2, 00:21:14.600 "base_bdevs_list": [ 00:21:14.600 { 00:21:14.600 "name": null, 00:21:14.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.600 "is_configured": false, 00:21:14.600 "data_offset": 2048, 00:21:14.600 "data_size": 63488 00:21:14.600 }, 00:21:14.600 { 00:21:14.600 "name": null, 00:21:14.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.600 "is_configured": false, 00:21:14.600 "data_offset": 2048, 00:21:14.600 "data_size": 63488 00:21:14.600 }, 00:21:14.600 { 00:21:14.600 "name": "BaseBdev3", 00:21:14.600 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:14.600 "is_configured": true, 00:21:14.600 "data_offset": 2048, 00:21:14.600 "data_size": 63488 00:21:14.600 }, 00:21:14.600 { 00:21:14.600 "name": "BaseBdev4", 00:21:14.600 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:14.600 "is_configured": true, 00:21:14.600 "data_offset": 2048, 00:21:14.600 "data_size": 63488 00:21:14.600 } 00:21:14.600 ] 00:21:14.600 }' 00:21:14.600 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.600 12:02:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:15.167 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:15.167 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:15.167 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:15.167 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:15.167 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:15.167 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:15.167 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:15.167 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:15.167 "name": "raid_bdev1", 00:21:15.167 "uuid": "b3d939dd-c6d0-4afe-8a2d-25add101120d", 00:21:15.167 "strip_size_kb": 0, 00:21:15.167 "state": "online", 00:21:15.167 "raid_level": "raid1", 00:21:15.167 "superblock": true, 00:21:15.167 "num_base_bdevs": 4, 00:21:15.167 "num_base_bdevs_discovered": 2, 00:21:15.167 "num_base_bdevs_operational": 2, 00:21:15.167 "base_bdevs_list": [ 00:21:15.167 { 00:21:15.167 "name": null, 00:21:15.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:15.167 "is_configured": false, 00:21:15.167 "data_offset": 2048, 00:21:15.167 "data_size": 63488 00:21:15.167 }, 00:21:15.167 { 00:21:15.167 "name": null, 00:21:15.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:15.167 "is_configured": false, 00:21:15.167 "data_offset": 2048, 00:21:15.167 "data_size": 63488 00:21:15.167 }, 00:21:15.167 { 00:21:15.167 "name": "BaseBdev3", 00:21:15.167 "uuid": "a8a2c5e3-38c2-5e0d-a5a8-e9f2aaf3c9f3", 00:21:15.167 "is_configured": true, 00:21:15.167 "data_offset": 2048, 00:21:15.167 "data_size": 63488 00:21:15.167 }, 00:21:15.167 { 00:21:15.167 "name": "BaseBdev4", 00:21:15.167 "uuid": "839a762c-822a-580a-a40d-c999ccf036a2", 00:21:15.167 "is_configured": true, 00:21:15.167 "data_offset": 2048, 00:21:15.167 "data_size": 63488 00:21:15.167 } 00:21:15.167 ] 00:21:15.167 }' 00:21:15.167 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 720360 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 720360 ']' 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 720360 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 720360 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 720360' 00:21:15.425 killing process with pid 720360 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 720360 00:21:15.425 Received shutdown signal, test time was about 22.701204 seconds 00:21:15.425 00:21:15.425 Latency(us) 00:21:15.425 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:15.425 =================================================================================================================== 00:21:15.425 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:15.425 [2024-07-12 12:02:05.531635] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:15.425 [2024-07-12 12:02:05.531708] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:15.425 [2024-07-12 12:02:05.531748] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:21:15.425 [2024-07-12 12:02:05.531754] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10b5a00 name raid_bdev1, state offline 00:21:15.425 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 720360 00:21:15.425 [2024-07-12 12:02:05.565354] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:15.683 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:21:15.683 00:21:15.683 real 0m26.847s 00:21:15.683 user 0m41.642s 00:21:15.683 sys 0m3.331s 00:21:15.683 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:15.683 12:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:15.683 ************************************ 00:21:15.683 END TEST raid_rebuild_test_sb_io 00:21:15.683 ************************************ 00:21:15.683 12:02:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:15.683 12:02:05 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:21:15.683 12:02:05 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:21:15.684 12:02:05 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:21:15.684 12:02:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:15.684 12:02:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:15.684 12:02:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:15.684 ************************************ 00:21:15.684 START TEST raid_state_function_test_sb_4k 00:21:15.684 ************************************ 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:15.684 12:02:05 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=725263 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 725263' 00:21:15.684 Process raid pid: 725263 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 725263 /var/tmp/spdk-raid.sock 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 725263 ']' 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:15.684 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:15.684 12:02:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:15.684 [2024-07-12 12:02:05.859441] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:21:15.684 [2024-07-12 12:02:05.859476] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:15.684 [2024-07-12 12:02:05.922976] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:15.968 [2024-07-12 12:02:06.000701] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:15.968 [2024-07-12 12:02:06.049525] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:15.968 [2024-07-12 12:02:06.049545] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:16.535 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:16.535 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:21:16.535 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:16.792 [2024-07-12 12:02:06.783950] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:16.792 [2024-07-12 12:02:06.783980] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:16.792 [2024-07-12 12:02:06.783990] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:16.792 [2024-07-12 12:02:06.783996] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: 
*DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:16.792 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:16.792 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:16.792 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:16.792 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:16.792 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:16.792 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:16.793 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.793 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.793 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.793 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.793 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.793 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:16.793 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.793 "name": "Existed_Raid", 00:21:16.793 "uuid": "15cdb6ec-ebe1-4924-add7-0e760b31ce0b", 00:21:16.793 "strip_size_kb": 0, 00:21:16.793 "state": "configuring", 00:21:16.793 "raid_level": "raid1", 00:21:16.793 "superblock": true, 00:21:16.793 "num_base_bdevs": 2, 
00:21:16.793 "num_base_bdevs_discovered": 0, 00:21:16.793 "num_base_bdevs_operational": 2, 00:21:16.793 "base_bdevs_list": [ 00:21:16.793 { 00:21:16.793 "name": "BaseBdev1", 00:21:16.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.793 "is_configured": false, 00:21:16.793 "data_offset": 0, 00:21:16.793 "data_size": 0 00:21:16.793 }, 00:21:16.793 { 00:21:16.793 "name": "BaseBdev2", 00:21:16.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.793 "is_configured": false, 00:21:16.793 "data_offset": 0, 00:21:16.793 "data_size": 0 00:21:16.793 } 00:21:16.793 ] 00:21:16.793 }' 00:21:16.793 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.793 12:02:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:17.357 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:17.615 [2024-07-12 12:02:07.622019] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:17.615 [2024-07-12 12:02:07.622036] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcca1b0 name Existed_Raid, state configuring 00:21:17.615 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:17.615 [2024-07-12 12:02:07.794483] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:17.615 [2024-07-12 12:02:07.794504] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:17.615 [2024-07-12 12:02:07.794509] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:17.615 [2024-07-12 12:02:07.794514] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:17.615 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:21:17.873 [2024-07-12 12:02:07.963017] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:17.873 BaseBdev1 00:21:17.873 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:17.873 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:17.873 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:17.873 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:21:17.873 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:17.873 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:17.873 12:02:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:18.131 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:18.131 [ 00:21:18.131 { 00:21:18.131 "name": "BaseBdev1", 00:21:18.131 "aliases": [ 00:21:18.131 "e225e4f0-3073-49eb-bcbd-f2bca171ea0a" 00:21:18.131 ], 00:21:18.131 "product_name": "Malloc disk", 00:21:18.131 "block_size": 4096, 00:21:18.131 "num_blocks": 8192, 00:21:18.131 "uuid": "e225e4f0-3073-49eb-bcbd-f2bca171ea0a", 00:21:18.131 "assigned_rate_limits": { 00:21:18.131 "rw_ios_per_sec": 0, 
00:21:18.131 "rw_mbytes_per_sec": 0, 00:21:18.131 "r_mbytes_per_sec": 0, 00:21:18.132 "w_mbytes_per_sec": 0 00:21:18.132 }, 00:21:18.132 "claimed": true, 00:21:18.132 "claim_type": "exclusive_write", 00:21:18.132 "zoned": false, 00:21:18.132 "supported_io_types": { 00:21:18.132 "read": true, 00:21:18.132 "write": true, 00:21:18.132 "unmap": true, 00:21:18.132 "flush": true, 00:21:18.132 "reset": true, 00:21:18.132 "nvme_admin": false, 00:21:18.132 "nvme_io": false, 00:21:18.132 "nvme_io_md": false, 00:21:18.132 "write_zeroes": true, 00:21:18.132 "zcopy": true, 00:21:18.132 "get_zone_info": false, 00:21:18.132 "zone_management": false, 00:21:18.132 "zone_append": false, 00:21:18.132 "compare": false, 00:21:18.132 "compare_and_write": false, 00:21:18.132 "abort": true, 00:21:18.132 "seek_hole": false, 00:21:18.132 "seek_data": false, 00:21:18.132 "copy": true, 00:21:18.132 "nvme_iov_md": false 00:21:18.132 }, 00:21:18.132 "memory_domains": [ 00:21:18.132 { 00:21:18.132 "dma_device_id": "system", 00:21:18.132 "dma_device_type": 1 00:21:18.132 }, 00:21:18.132 { 00:21:18.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.132 "dma_device_type": 2 00:21:18.132 } 00:21:18.132 ], 00:21:18.132 "driver_specific": {} 00:21:18.132 } 00:21:18.132 ] 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.132 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:18.389 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.389 "name": "Existed_Raid", 00:21:18.389 "uuid": "f4c93588-2961-4e55-969b-51ea1ca1fe4d", 00:21:18.389 "strip_size_kb": 0, 00:21:18.389 "state": "configuring", 00:21:18.389 "raid_level": "raid1", 00:21:18.389 "superblock": true, 00:21:18.389 "num_base_bdevs": 2, 00:21:18.389 "num_base_bdevs_discovered": 1, 00:21:18.389 "num_base_bdevs_operational": 2, 00:21:18.389 "base_bdevs_list": [ 00:21:18.389 { 00:21:18.389 "name": "BaseBdev1", 00:21:18.389 "uuid": "e225e4f0-3073-49eb-bcbd-f2bca171ea0a", 00:21:18.389 "is_configured": true, 00:21:18.389 "data_offset": 256, 00:21:18.389 "data_size": 7936 00:21:18.389 }, 00:21:18.389 { 00:21:18.389 "name": "BaseBdev2", 00:21:18.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.389 "is_configured": false, 00:21:18.389 "data_offset": 0, 00:21:18.389 "data_size": 0 00:21:18.389 } 00:21:18.389 ] 00:21:18.389 }' 00:21:18.389 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.389 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:18.953 12:02:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:18.953 [2024-07-12 12:02:09.109981] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:18.953 [2024-07-12 12:02:09.110010] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc9aa0 name Existed_Raid, state configuring 00:21:18.953 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:19.211 [2024-07-12 12:02:09.278439] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:19.211 [2024-07-12 12:02:09.279468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:19.211 [2024-07-12 12:02:09.279490] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.211 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:19.481 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.481 "name": "Existed_Raid", 00:21:19.481 "uuid": "8133e8ef-3492-46e4-93ec-e49921c948a7", 00:21:19.481 "strip_size_kb": 0, 00:21:19.481 "state": "configuring", 00:21:19.481 "raid_level": "raid1", 00:21:19.481 "superblock": true, 00:21:19.481 "num_base_bdevs": 2, 00:21:19.481 "num_base_bdevs_discovered": 1, 00:21:19.481 "num_base_bdevs_operational": 2, 00:21:19.481 "base_bdevs_list": [ 00:21:19.481 { 00:21:19.481 "name": "BaseBdev1", 00:21:19.481 "uuid": "e225e4f0-3073-49eb-bcbd-f2bca171ea0a", 00:21:19.481 "is_configured": true, 00:21:19.481 "data_offset": 256, 00:21:19.481 "data_size": 7936 00:21:19.481 }, 00:21:19.481 { 00:21:19.481 "name": "BaseBdev2", 00:21:19.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.481 "is_configured": false, 00:21:19.481 "data_offset": 0, 00:21:19.481 "data_size": 0 
00:21:19.481 } 00:21:19.481 ] 00:21:19.481 }' 00:21:19.481 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.481 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:19.817 12:02:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:21:20.078 [2024-07-12 12:02:10.111377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:20.078 [2024-07-12 12:02:10.111483] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcca890 00:21:20.078 [2024-07-12 12:02:10.111492] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:20.078 [2024-07-12 12:02:10.111624] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcc8c20 00:21:20.078 [2024-07-12 12:02:10.111708] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcca890 00:21:20.078 [2024-07-12 12:02:10.111714] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcca890 00:21:20.078 [2024-07-12 12:02:10.111777] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:20.078 BaseBdev2 00:21:20.078 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:20.078 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:20.078 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:20.078 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:21:20.078 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:20.078 12:02:10 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:20.079 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:20.079 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:20.339 [ 00:21:20.339 { 00:21:20.339 "name": "BaseBdev2", 00:21:20.339 "aliases": [ 00:21:20.339 "7ab4fec8-223d-4072-9657-5a3057d022ac" 00:21:20.339 ], 00:21:20.339 "product_name": "Malloc disk", 00:21:20.339 "block_size": 4096, 00:21:20.339 "num_blocks": 8192, 00:21:20.339 "uuid": "7ab4fec8-223d-4072-9657-5a3057d022ac", 00:21:20.339 "assigned_rate_limits": { 00:21:20.339 "rw_ios_per_sec": 0, 00:21:20.339 "rw_mbytes_per_sec": 0, 00:21:20.339 "r_mbytes_per_sec": 0, 00:21:20.339 "w_mbytes_per_sec": 0 00:21:20.339 }, 00:21:20.339 "claimed": true, 00:21:20.339 "claim_type": "exclusive_write", 00:21:20.339 "zoned": false, 00:21:20.339 "supported_io_types": { 00:21:20.339 "read": true, 00:21:20.339 "write": true, 00:21:20.339 "unmap": true, 00:21:20.339 "flush": true, 00:21:20.339 "reset": true, 00:21:20.339 "nvme_admin": false, 00:21:20.339 "nvme_io": false, 00:21:20.339 "nvme_io_md": false, 00:21:20.339 "write_zeroes": true, 00:21:20.339 "zcopy": true, 00:21:20.339 "get_zone_info": false, 00:21:20.339 "zone_management": false, 00:21:20.340 "zone_append": false, 00:21:20.340 "compare": false, 00:21:20.340 "compare_and_write": false, 00:21:20.340 "abort": true, 00:21:20.340 "seek_hole": false, 00:21:20.340 "seek_data": false, 00:21:20.340 "copy": true, 00:21:20.340 "nvme_iov_md": false 00:21:20.340 }, 00:21:20.340 "memory_domains": [ 00:21:20.340 { 00:21:20.340 "dma_device_id": "system", 00:21:20.340 "dma_device_type": 1 00:21:20.340 }, 
00:21:20.340 { 00:21:20.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.340 "dma_device_type": 2 00:21:20.340 } 00:21:20.340 ], 00:21:20.340 "driver_specific": {} 00:21:20.340 } 00:21:20.340 ] 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:20.340 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.340 12:02:10 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:20.598 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.598 "name": "Existed_Raid", 00:21:20.598 "uuid": "8133e8ef-3492-46e4-93ec-e49921c948a7", 00:21:20.598 "strip_size_kb": 0, 00:21:20.598 "state": "online", 00:21:20.598 "raid_level": "raid1", 00:21:20.598 "superblock": true, 00:21:20.598 "num_base_bdevs": 2, 00:21:20.598 "num_base_bdevs_discovered": 2, 00:21:20.598 "num_base_bdevs_operational": 2, 00:21:20.598 "base_bdevs_list": [ 00:21:20.598 { 00:21:20.598 "name": "BaseBdev1", 00:21:20.598 "uuid": "e225e4f0-3073-49eb-bcbd-f2bca171ea0a", 00:21:20.598 "is_configured": true, 00:21:20.598 "data_offset": 256, 00:21:20.598 "data_size": 7936 00:21:20.598 }, 00:21:20.598 { 00:21:20.598 "name": "BaseBdev2", 00:21:20.598 "uuid": "7ab4fec8-223d-4072-9657-5a3057d022ac", 00:21:20.598 "is_configured": true, 00:21:20.598 "data_offset": 256, 00:21:20.598 "data_size": 7936 00:21:20.598 } 00:21:20.598 ] 00:21:20.598 }' 00:21:20.598 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.598 12:02:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:20.856 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:20.856 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:20.856 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:20.856 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:20.856 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:20.856 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@198 -- # local name 00:21:20.856 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:20.856 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:21.113 [2024-07-12 12:02:11.230448] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:21.113 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:21.113 "name": "Existed_Raid", 00:21:21.113 "aliases": [ 00:21:21.113 "8133e8ef-3492-46e4-93ec-e49921c948a7" 00:21:21.113 ], 00:21:21.113 "product_name": "Raid Volume", 00:21:21.113 "block_size": 4096, 00:21:21.113 "num_blocks": 7936, 00:21:21.113 "uuid": "8133e8ef-3492-46e4-93ec-e49921c948a7", 00:21:21.113 "assigned_rate_limits": { 00:21:21.113 "rw_ios_per_sec": 0, 00:21:21.113 "rw_mbytes_per_sec": 0, 00:21:21.113 "r_mbytes_per_sec": 0, 00:21:21.113 "w_mbytes_per_sec": 0 00:21:21.113 }, 00:21:21.113 "claimed": false, 00:21:21.113 "zoned": false, 00:21:21.113 "supported_io_types": { 00:21:21.113 "read": true, 00:21:21.113 "write": true, 00:21:21.113 "unmap": false, 00:21:21.113 "flush": false, 00:21:21.113 "reset": true, 00:21:21.113 "nvme_admin": false, 00:21:21.113 "nvme_io": false, 00:21:21.113 "nvme_io_md": false, 00:21:21.113 "write_zeroes": true, 00:21:21.113 "zcopy": false, 00:21:21.113 "get_zone_info": false, 00:21:21.113 "zone_management": false, 00:21:21.113 "zone_append": false, 00:21:21.113 "compare": false, 00:21:21.113 "compare_and_write": false, 00:21:21.113 "abort": false, 00:21:21.113 "seek_hole": false, 00:21:21.113 "seek_data": false, 00:21:21.113 "copy": false, 00:21:21.113 "nvme_iov_md": false 00:21:21.113 }, 00:21:21.113 "memory_domains": [ 00:21:21.113 { 00:21:21.113 "dma_device_id": "system", 00:21:21.113 "dma_device_type": 1 00:21:21.113 }, 00:21:21.113 { 
00:21:21.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.113 "dma_device_type": 2 00:21:21.113 }, 00:21:21.113 { 00:21:21.113 "dma_device_id": "system", 00:21:21.113 "dma_device_type": 1 00:21:21.113 }, 00:21:21.113 { 00:21:21.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.113 "dma_device_type": 2 00:21:21.113 } 00:21:21.113 ], 00:21:21.113 "driver_specific": { 00:21:21.113 "raid": { 00:21:21.113 "uuid": "8133e8ef-3492-46e4-93ec-e49921c948a7", 00:21:21.113 "strip_size_kb": 0, 00:21:21.113 "state": "online", 00:21:21.113 "raid_level": "raid1", 00:21:21.113 "superblock": true, 00:21:21.113 "num_base_bdevs": 2, 00:21:21.113 "num_base_bdevs_discovered": 2, 00:21:21.113 "num_base_bdevs_operational": 2, 00:21:21.113 "base_bdevs_list": [ 00:21:21.113 { 00:21:21.113 "name": "BaseBdev1", 00:21:21.113 "uuid": "e225e4f0-3073-49eb-bcbd-f2bca171ea0a", 00:21:21.113 "is_configured": true, 00:21:21.113 "data_offset": 256, 00:21:21.113 "data_size": 7936 00:21:21.113 }, 00:21:21.113 { 00:21:21.113 "name": "BaseBdev2", 00:21:21.113 "uuid": "7ab4fec8-223d-4072-9657-5a3057d022ac", 00:21:21.113 "is_configured": true, 00:21:21.113 "data_offset": 256, 00:21:21.113 "data_size": 7936 00:21:21.113 } 00:21:21.113 ] 00:21:21.113 } 00:21:21.113 } 00:21:21.113 }' 00:21:21.113 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:21.113 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:21.113 BaseBdev2' 00:21:21.113 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.113 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:21.113 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:21.370 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:21.370 "name": "BaseBdev1", 00:21:21.370 "aliases": [ 00:21:21.370 "e225e4f0-3073-49eb-bcbd-f2bca171ea0a" 00:21:21.370 ], 00:21:21.370 "product_name": "Malloc disk", 00:21:21.370 "block_size": 4096, 00:21:21.370 "num_blocks": 8192, 00:21:21.370 "uuid": "e225e4f0-3073-49eb-bcbd-f2bca171ea0a", 00:21:21.370 "assigned_rate_limits": { 00:21:21.370 "rw_ios_per_sec": 0, 00:21:21.370 "rw_mbytes_per_sec": 0, 00:21:21.370 "r_mbytes_per_sec": 0, 00:21:21.370 "w_mbytes_per_sec": 0 00:21:21.370 }, 00:21:21.370 "claimed": true, 00:21:21.370 "claim_type": "exclusive_write", 00:21:21.370 "zoned": false, 00:21:21.370 "supported_io_types": { 00:21:21.370 "read": true, 00:21:21.370 "write": true, 00:21:21.370 "unmap": true, 00:21:21.370 "flush": true, 00:21:21.370 "reset": true, 00:21:21.370 "nvme_admin": false, 00:21:21.370 "nvme_io": false, 00:21:21.370 "nvme_io_md": false, 00:21:21.370 "write_zeroes": true, 00:21:21.370 "zcopy": true, 00:21:21.370 "get_zone_info": false, 00:21:21.371 "zone_management": false, 00:21:21.371 "zone_append": false, 00:21:21.371 "compare": false, 00:21:21.371 "compare_and_write": false, 00:21:21.371 "abort": true, 00:21:21.371 "seek_hole": false, 00:21:21.371 "seek_data": false, 00:21:21.371 "copy": true, 00:21:21.371 "nvme_iov_md": false 00:21:21.371 }, 00:21:21.371 "memory_domains": [ 00:21:21.371 { 00:21:21.371 "dma_device_id": "system", 00:21:21.371 "dma_device_type": 1 00:21:21.371 }, 00:21:21.371 { 00:21:21.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.371 "dma_device_type": 2 00:21:21.371 } 00:21:21.371 ], 00:21:21.371 "driver_specific": {} 00:21:21.371 }' 00:21:21.371 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.371 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.371 
12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:21.371 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.371 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.371 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:21.371 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.371 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.629 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:21.629 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.629 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.629 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:21.629 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.629 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:21.629 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:21.629 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:21.629 "name": "BaseBdev2", 00:21:21.629 "aliases": [ 00:21:21.629 "7ab4fec8-223d-4072-9657-5a3057d022ac" 00:21:21.629 ], 00:21:21.629 "product_name": "Malloc disk", 00:21:21.629 "block_size": 4096, 00:21:21.629 "num_blocks": 8192, 00:21:21.629 "uuid": "7ab4fec8-223d-4072-9657-5a3057d022ac", 00:21:21.629 "assigned_rate_limits": { 00:21:21.629 "rw_ios_per_sec": 0, 
00:21:21.629 "rw_mbytes_per_sec": 0, 00:21:21.629 "r_mbytes_per_sec": 0, 00:21:21.629 "w_mbytes_per_sec": 0 00:21:21.629 }, 00:21:21.629 "claimed": true, 00:21:21.629 "claim_type": "exclusive_write", 00:21:21.629 "zoned": false, 00:21:21.629 "supported_io_types": { 00:21:21.629 "read": true, 00:21:21.629 "write": true, 00:21:21.629 "unmap": true, 00:21:21.629 "flush": true, 00:21:21.629 "reset": true, 00:21:21.629 "nvme_admin": false, 00:21:21.629 "nvme_io": false, 00:21:21.629 "nvme_io_md": false, 00:21:21.629 "write_zeroes": true, 00:21:21.629 "zcopy": true, 00:21:21.629 "get_zone_info": false, 00:21:21.629 "zone_management": false, 00:21:21.629 "zone_append": false, 00:21:21.629 "compare": false, 00:21:21.629 "compare_and_write": false, 00:21:21.629 "abort": true, 00:21:21.629 "seek_hole": false, 00:21:21.629 "seek_data": false, 00:21:21.629 "copy": true, 00:21:21.629 "nvme_iov_md": false 00:21:21.629 }, 00:21:21.629 "memory_domains": [ 00:21:21.629 { 00:21:21.629 "dma_device_id": "system", 00:21:21.629 "dma_device_type": 1 00:21:21.629 }, 00:21:21.629 { 00:21:21.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.629 "dma_device_type": 2 00:21:21.629 } 00:21:21.629 ], 00:21:21.629 "driver_specific": {} 00:21:21.629 }' 00:21:21.629 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.887 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.887 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:21.887 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.887 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.887 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:21.887 12:02:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:21:21.887 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.887 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:21.887 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.887 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.887 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:21.887 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:22.146 [2024-07-12 12:02:12.273000] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.146 12:02:12 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.146 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:22.410 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.410 "name": "Existed_Raid", 00:21:22.410 "uuid": "8133e8ef-3492-46e4-93ec-e49921c948a7", 00:21:22.410 "strip_size_kb": 0, 00:21:22.410 "state": "online", 00:21:22.410 "raid_level": "raid1", 00:21:22.410 "superblock": true, 00:21:22.410 "num_base_bdevs": 2, 00:21:22.410 "num_base_bdevs_discovered": 1, 00:21:22.410 "num_base_bdevs_operational": 1, 00:21:22.410 "base_bdevs_list": [ 00:21:22.410 { 00:21:22.410 "name": null, 00:21:22.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.410 "is_configured": false, 00:21:22.410 "data_offset": 256, 00:21:22.410 "data_size": 7936 00:21:22.410 }, 00:21:22.410 { 00:21:22.410 "name": "BaseBdev2", 00:21:22.410 "uuid": "7ab4fec8-223d-4072-9657-5a3057d022ac", 00:21:22.410 "is_configured": true, 00:21:22.410 "data_offset": 256, 00:21:22.410 "data_size": 7936 00:21:22.410 } 00:21:22.410 ] 00:21:22.410 }' 
00:21:22.410 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.410 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:22.975 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:22.975 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:22.975 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.975 12:02:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:22.975 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:22.975 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:22.975 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:23.232 [2024-07-12 12:02:13.248397] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:23.232 [2024-07-12 12:02:13.248460] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:23.232 [2024-07-12 12:02:13.258345] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:23.232 [2024-07-12 12:02:13.258387] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:23.232 [2024-07-12 12:02:13.258393] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcca890 name Existed_Raid, state offline 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:23.232 
12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 725263 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 725263 ']' 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 725263 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 725263 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 725263' 00:21:23.232 killing process with pid 725263 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 
-- # kill 725263 00:21:23.232 [2024-07-12 12:02:13.474501] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:23.232 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 725263 00:21:23.232 [2024-07-12 12:02:13.475255] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:23.491 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:21:23.491 00:21:23.491 real 0m7.843s 00:21:23.491 user 0m14.108s 00:21:23.491 sys 0m1.244s 00:21:23.491 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:23.491 12:02:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:23.491 ************************************ 00:21:23.491 END TEST raid_state_function_test_sb_4k 00:21:23.491 ************************************ 00:21:23.491 12:02:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:23.491 12:02:13 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:21:23.491 12:02:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:23.491 12:02:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:23.491 12:02:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:23.491 ************************************ 00:21:23.491 START TEST raid_superblock_test_4k 00:21:23.491 ************************************ 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:23.491 12:02:13 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=726725 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 726725 /var/tmp/spdk-raid.sock 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 726725 ']' 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:23.491 12:02:13 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:23.491 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:23.491 12:02:13 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:23.750 [2024-07-12 12:02:13.772097] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:21:23.750 [2024-07-12 12:02:13.772134] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid726725 ] 00:21:23.750 [2024-07-12 12:02:13.835813] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:23.750 [2024-07-12 12:02:13.913764] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:23.750 [2024-07-12 12:02:13.969451] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:23.750 [2024-07-12 12:02:13.969479] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:24.685 12:02:14 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:21:24.685 malloc1 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:24.685 [2024-07-12 12:02:14.897700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:24.685 [2024-07-12 12:02:14.897732] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:24.685 [2024-07-12 12:02:14.897744] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b66270 00:21:24.685 [2024-07-12 12:02:14.897765] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:24.685 [2024-07-12 12:02:14.898925] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:24.685 [2024-07-12 12:02:14.898944] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:24.685 pt1 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= 
num_base_bdevs )) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:24.685 12:02:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:21:24.944 malloc2 00:21:24.944 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:25.202 [2024-07-12 12:02:15.222088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:25.202 [2024-07-12 12:02:15.222119] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.202 [2024-07-12 12:02:15.222128] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b67580 00:21:25.202 [2024-07-12 12:02:15.222134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.202 [2024-07-12 12:02:15.223195] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.202 [2024-07-12 12:02:15.223214] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:25.202 pt2 00:21:25.202 12:02:15 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:21:25.202 [2024-07-12 12:02:15.390538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:25.202 [2024-07-12 12:02:15.391444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:25.202 [2024-07-12 12:02:15.391552] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d11890 00:21:25.202 [2024-07-12 12:02:15.391560] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:25.202 [2024-07-12 12:02:15.391691] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b7d160 00:21:25.202 [2024-07-12 12:02:15.391800] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d11890 00:21:25.202 [2024-07-12 12:02:15.391805] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d11890 00:21:25.202 [2024-07-12 12:02:15.391866] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.202 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.464 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.464 "name": "raid_bdev1", 00:21:25.464 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:25.464 "strip_size_kb": 0, 00:21:25.464 "state": "online", 00:21:25.464 "raid_level": "raid1", 00:21:25.464 "superblock": true, 00:21:25.464 "num_base_bdevs": 2, 00:21:25.464 "num_base_bdevs_discovered": 2, 00:21:25.464 "num_base_bdevs_operational": 2, 00:21:25.464 "base_bdevs_list": [ 00:21:25.464 { 00:21:25.464 "name": "pt1", 00:21:25.464 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:25.464 "is_configured": true, 00:21:25.464 "data_offset": 256, 00:21:25.464 "data_size": 7936 00:21:25.464 }, 00:21:25.464 { 00:21:25.464 "name": "pt2", 00:21:25.464 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:25.464 "is_configured": true, 00:21:25.464 "data_offset": 256, 00:21:25.464 "data_size": 7936 00:21:25.464 } 00:21:25.464 ] 00:21:25.464 }' 00:21:25.464 12:02:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.464 12:02:15 bdev_raid.raid_superblock_test_4k 
-- common/autotest_common.sh@10 -- # set +x 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:26.030 [2024-07-12 12:02:16.204776] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:26.030 "name": "raid_bdev1", 00:21:26.030 "aliases": [ 00:21:26.030 "e765c662-f7af-4b3f-8a44-e3c5db911f14" 00:21:26.030 ], 00:21:26.030 "product_name": "Raid Volume", 00:21:26.030 "block_size": 4096, 00:21:26.030 "num_blocks": 7936, 00:21:26.030 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:26.030 "assigned_rate_limits": { 00:21:26.030 "rw_ios_per_sec": 0, 00:21:26.030 "rw_mbytes_per_sec": 0, 00:21:26.030 "r_mbytes_per_sec": 0, 00:21:26.030 "w_mbytes_per_sec": 0 00:21:26.030 }, 00:21:26.030 "claimed": false, 00:21:26.030 "zoned": false, 00:21:26.030 "supported_io_types": { 00:21:26.030 "read": true, 00:21:26.030 "write": true, 00:21:26.030 "unmap": false, 00:21:26.030 "flush": false, 00:21:26.030 "reset": true, 00:21:26.030 "nvme_admin": false, 
00:21:26.030 "nvme_io": false, 00:21:26.030 "nvme_io_md": false, 00:21:26.030 "write_zeroes": true, 00:21:26.030 "zcopy": false, 00:21:26.030 "get_zone_info": false, 00:21:26.030 "zone_management": false, 00:21:26.030 "zone_append": false, 00:21:26.030 "compare": false, 00:21:26.030 "compare_and_write": false, 00:21:26.030 "abort": false, 00:21:26.030 "seek_hole": false, 00:21:26.030 "seek_data": false, 00:21:26.030 "copy": false, 00:21:26.030 "nvme_iov_md": false 00:21:26.030 }, 00:21:26.030 "memory_domains": [ 00:21:26.030 { 00:21:26.030 "dma_device_id": "system", 00:21:26.030 "dma_device_type": 1 00:21:26.030 }, 00:21:26.030 { 00:21:26.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.030 "dma_device_type": 2 00:21:26.030 }, 00:21:26.030 { 00:21:26.030 "dma_device_id": "system", 00:21:26.030 "dma_device_type": 1 00:21:26.030 }, 00:21:26.030 { 00:21:26.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.030 "dma_device_type": 2 00:21:26.030 } 00:21:26.030 ], 00:21:26.030 "driver_specific": { 00:21:26.030 "raid": { 00:21:26.030 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:26.030 "strip_size_kb": 0, 00:21:26.030 "state": "online", 00:21:26.030 "raid_level": "raid1", 00:21:26.030 "superblock": true, 00:21:26.030 "num_base_bdevs": 2, 00:21:26.030 "num_base_bdevs_discovered": 2, 00:21:26.030 "num_base_bdevs_operational": 2, 00:21:26.030 "base_bdevs_list": [ 00:21:26.030 { 00:21:26.030 "name": "pt1", 00:21:26.030 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:26.030 "is_configured": true, 00:21:26.030 "data_offset": 256, 00:21:26.030 "data_size": 7936 00:21:26.030 }, 00:21:26.030 { 00:21:26.030 "name": "pt2", 00:21:26.030 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:26.030 "is_configured": true, 00:21:26.030 "data_offset": 256, 00:21:26.030 "data_size": 7936 00:21:26.030 } 00:21:26.030 ] 00:21:26.030 } 00:21:26.030 } 00:21:26.030 }' 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:26.030 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:26.030 pt2' 00:21:26.031 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:26.031 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:26.031 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:26.289 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:26.289 "name": "pt1", 00:21:26.289 "aliases": [ 00:21:26.289 "00000000-0000-0000-0000-000000000001" 00:21:26.289 ], 00:21:26.289 "product_name": "passthru", 00:21:26.289 "block_size": 4096, 00:21:26.289 "num_blocks": 8192, 00:21:26.289 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:26.289 "assigned_rate_limits": { 00:21:26.289 "rw_ios_per_sec": 0, 00:21:26.289 "rw_mbytes_per_sec": 0, 00:21:26.289 "r_mbytes_per_sec": 0, 00:21:26.289 "w_mbytes_per_sec": 0 00:21:26.289 }, 00:21:26.289 "claimed": true, 00:21:26.289 "claim_type": "exclusive_write", 00:21:26.289 "zoned": false, 00:21:26.289 "supported_io_types": { 00:21:26.289 "read": true, 00:21:26.289 "write": true, 00:21:26.289 "unmap": true, 00:21:26.289 "flush": true, 00:21:26.289 "reset": true, 00:21:26.289 "nvme_admin": false, 00:21:26.289 "nvme_io": false, 00:21:26.289 "nvme_io_md": false, 00:21:26.289 "write_zeroes": true, 00:21:26.289 "zcopy": true, 00:21:26.289 "get_zone_info": false, 00:21:26.289 "zone_management": false, 00:21:26.289 "zone_append": false, 00:21:26.289 "compare": false, 00:21:26.289 "compare_and_write": false, 00:21:26.289 "abort": true, 00:21:26.289 "seek_hole": false, 00:21:26.289 "seek_data": false, 00:21:26.289 "copy": true, 00:21:26.289 "nvme_iov_md": false 00:21:26.289 
}, 00:21:26.289 "memory_domains": [ 00:21:26.289 { 00:21:26.289 "dma_device_id": "system", 00:21:26.289 "dma_device_type": 1 00:21:26.289 }, 00:21:26.289 { 00:21:26.289 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.289 "dma_device_type": 2 00:21:26.289 } 00:21:26.289 ], 00:21:26.289 "driver_specific": { 00:21:26.289 "passthru": { 00:21:26.289 "name": "pt1", 00:21:26.289 "base_bdev_name": "malloc1" 00:21:26.289 } 00:21:26.289 } 00:21:26.289 }' 00:21:26.289 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.289 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.289 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:26.289 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.289 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 
00:21:26.548 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:26.806 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:26.807 "name": "pt2", 00:21:26.807 "aliases": [ 00:21:26.807 "00000000-0000-0000-0000-000000000002" 00:21:26.807 ], 00:21:26.807 "product_name": "passthru", 00:21:26.807 "block_size": 4096, 00:21:26.807 "num_blocks": 8192, 00:21:26.807 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:26.807 "assigned_rate_limits": { 00:21:26.807 "rw_ios_per_sec": 0, 00:21:26.807 "rw_mbytes_per_sec": 0, 00:21:26.807 "r_mbytes_per_sec": 0, 00:21:26.807 "w_mbytes_per_sec": 0 00:21:26.807 }, 00:21:26.807 "claimed": true, 00:21:26.807 "claim_type": "exclusive_write", 00:21:26.807 "zoned": false, 00:21:26.807 "supported_io_types": { 00:21:26.807 "read": true, 00:21:26.807 "write": true, 00:21:26.807 "unmap": true, 00:21:26.807 "flush": true, 00:21:26.807 "reset": true, 00:21:26.807 "nvme_admin": false, 00:21:26.807 "nvme_io": false, 00:21:26.807 "nvme_io_md": false, 00:21:26.807 "write_zeroes": true, 00:21:26.807 "zcopy": true, 00:21:26.807 "get_zone_info": false, 00:21:26.807 "zone_management": false, 00:21:26.807 "zone_append": false, 00:21:26.807 "compare": false, 00:21:26.807 "compare_and_write": false, 00:21:26.807 "abort": true, 00:21:26.807 "seek_hole": false, 00:21:26.807 "seek_data": false, 00:21:26.807 "copy": true, 00:21:26.807 "nvme_iov_md": false 00:21:26.807 }, 00:21:26.807 "memory_domains": [ 00:21:26.807 { 00:21:26.807 "dma_device_id": "system", 00:21:26.807 "dma_device_type": 1 00:21:26.807 }, 00:21:26.807 { 00:21:26.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.807 "dma_device_type": 2 00:21:26.807 } 00:21:26.807 ], 00:21:26.807 "driver_specific": { 00:21:26.807 "passthru": { 00:21:26.807 "name": "pt2", 00:21:26.807 "base_bdev_name": "malloc2" 00:21:26.807 } 00:21:26.807 } 00:21:26.807 }' 00:21:26.807 12:02:16 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.807 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.807 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:26.807 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.807 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.807 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:26.807 12:02:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.807 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.065 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:27.065 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.065 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.065 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:27.065 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:27.065 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:27.065 [2024-07-12 12:02:17.263524] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:27.065 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=e765c662-f7af-4b3f-8a44-e3c5db911f14 00:21:27.065 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z e765c662-f7af-4b3f-8a44-e3c5db911f14 ']' 00:21:27.065 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:27.323 [2024-07-12 12:02:17.423774] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:27.323 [2024-07-12 12:02:17.423786] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:27.323 [2024-07-12 12:02:17.423821] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:27.323 [2024-07-12 12:02:17.423857] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:27.323 [2024-07-12 12:02:17.423865] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d11890 name raid_bdev1, state offline 00:21:27.323 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.323 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:27.582 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:27.582 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:27.582 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:27.582 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:27.582 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:27.582 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:27.840 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:27.840 12:02:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:28.099 [2024-07-12 12:02:18.237875] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:28.099 [2024-07-12 12:02:18.238854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:28.099 [2024-07-12 12:02:18.238896] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:28.099 [2024-07-12 12:02:18.238921] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:28.099 [2024-07-12 12:02:18.238931] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:28.099 [2024-07-12 12:02:18.238953] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d10a50 name raid_bdev1, state configuring 00:21:28.099 request: 00:21:28.099 { 00:21:28.099 "name": "raid_bdev1", 00:21:28.099 "raid_level": "raid1", 00:21:28.099 "base_bdevs": [ 00:21:28.099 "malloc1", 00:21:28.099 "malloc2" 00:21:28.099 ], 00:21:28.099 "superblock": false, 00:21:28.099 "method": "bdev_raid_create", 00:21:28.099 "req_id": 1 00:21:28.099 } 00:21:28.099 Got JSON-RPC error response 00:21:28.099 response: 00:21:28.099 { 00:21:28.099 "code": -17, 00:21:28.099 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:28.099 } 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:28.099 
12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.099 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:28.357 [2024-07-12 12:02:18.582746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:28.357 [2024-07-12 12:02:18.582770] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:28.357 [2024-07-12 12:02:18.582781] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b664a0 00:21:28.357 [2024-07-12 12:02:18.582802] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:28.357 [2024-07-12 12:02:18.583970] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:28.357 [2024-07-12 12:02:18.583992] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:28.357 [2024-07-12 12:02:18.584036] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:28.357 [2024-07-12 12:02:18.584056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:28.357 pt1 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.357 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.358 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.358 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.358 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.358 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.616 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.616 "name": "raid_bdev1", 00:21:28.616 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:28.616 "strip_size_kb": 0, 00:21:28.616 "state": "configuring", 00:21:28.616 "raid_level": "raid1", 00:21:28.616 "superblock": true, 00:21:28.616 "num_base_bdevs": 2, 00:21:28.616 "num_base_bdevs_discovered": 1, 00:21:28.616 "num_base_bdevs_operational": 2, 00:21:28.616 "base_bdevs_list": [ 00:21:28.616 { 00:21:28.616 "name": "pt1", 00:21:28.616 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:28.616 "is_configured": true, 00:21:28.616 "data_offset": 256, 00:21:28.616 "data_size": 7936 00:21:28.616 }, 00:21:28.616 { 00:21:28.616 "name": null, 00:21:28.616 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:28.616 "is_configured": false, 00:21:28.616 "data_offset": 256, 00:21:28.616 "data_size": 7936 00:21:28.616 } 00:21:28.616 ] 00:21:28.616 }' 00:21:28.616 12:02:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.616 12:02:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:29.183 [2024-07-12 12:02:19.348726] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:29.183 [2024-07-12 12:02:19.348776] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:29.183 [2024-07-12 12:02:19.348788] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d10e10 00:21:29.183 [2024-07-12 12:02:19.348810] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:29.183 [2024-07-12 12:02:19.349068] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:29.183 [2024-07-12 12:02:19.349077] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:29.183 [2024-07-12 12:02:19.349118] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock 
found on bdev pt2 00:21:29.183 [2024-07-12 12:02:19.349130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:29.183 [2024-07-12 12:02:19.349201] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b653b0 00:21:29.183 [2024-07-12 12:02:19.349207] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:29.183 [2024-07-12 12:02:19.349317] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b66df0 00:21:29.183 [2024-07-12 12:02:19.349402] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b653b0 00:21:29.183 [2024-07-12 12:02:19.349407] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b653b0 00:21:29.183 [2024-07-12 12:02:19.349473] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.183 pt2 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.183 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.441 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:29.441 "name": "raid_bdev1", 00:21:29.441 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:29.441 "strip_size_kb": 0, 00:21:29.441 "state": "online", 00:21:29.441 "raid_level": "raid1", 00:21:29.441 "superblock": true, 00:21:29.441 "num_base_bdevs": 2, 00:21:29.441 "num_base_bdevs_discovered": 2, 00:21:29.441 "num_base_bdevs_operational": 2, 00:21:29.441 "base_bdevs_list": [ 00:21:29.441 { 00:21:29.441 "name": "pt1", 00:21:29.441 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:29.441 "is_configured": true, 00:21:29.441 "data_offset": 256, 00:21:29.441 "data_size": 7936 00:21:29.441 }, 00:21:29.441 { 00:21:29.441 "name": "pt2", 00:21:29.441 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:29.441 "is_configured": true, 00:21:29.441 "data_offset": 256, 00:21:29.441 "data_size": 7936 00:21:29.441 } 00:21:29.441 ] 00:21:29.441 }' 00:21:29.441 12:02:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:29.442 12:02:19 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:30.009 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:30.009 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 
00:21:30.009 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:30.009 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:30.009 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:30.009 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:21:30.009 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:30.009 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:30.009 [2024-07-12 12:02:20.154961] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:30.009 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:30.009 "name": "raid_bdev1", 00:21:30.009 "aliases": [ 00:21:30.009 "e765c662-f7af-4b3f-8a44-e3c5db911f14" 00:21:30.009 ], 00:21:30.009 "product_name": "Raid Volume", 00:21:30.009 "block_size": 4096, 00:21:30.009 "num_blocks": 7936, 00:21:30.009 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:30.009 "assigned_rate_limits": { 00:21:30.009 "rw_ios_per_sec": 0, 00:21:30.009 "rw_mbytes_per_sec": 0, 00:21:30.009 "r_mbytes_per_sec": 0, 00:21:30.009 "w_mbytes_per_sec": 0 00:21:30.009 }, 00:21:30.009 "claimed": false, 00:21:30.009 "zoned": false, 00:21:30.009 "supported_io_types": { 00:21:30.009 "read": true, 00:21:30.009 "write": true, 00:21:30.009 "unmap": false, 00:21:30.009 "flush": false, 00:21:30.009 "reset": true, 00:21:30.009 "nvme_admin": false, 00:21:30.009 "nvme_io": false, 00:21:30.009 "nvme_io_md": false, 00:21:30.009 "write_zeroes": true, 00:21:30.009 "zcopy": false, 00:21:30.009 "get_zone_info": false, 00:21:30.009 "zone_management": false, 00:21:30.009 "zone_append": false, 00:21:30.009 "compare": false, 00:21:30.009 
"compare_and_write": false, 00:21:30.009 "abort": false, 00:21:30.009 "seek_hole": false, 00:21:30.009 "seek_data": false, 00:21:30.009 "copy": false, 00:21:30.009 "nvme_iov_md": false 00:21:30.009 }, 00:21:30.009 "memory_domains": [ 00:21:30.009 { 00:21:30.009 "dma_device_id": "system", 00:21:30.009 "dma_device_type": 1 00:21:30.009 }, 00:21:30.009 { 00:21:30.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.009 "dma_device_type": 2 00:21:30.009 }, 00:21:30.009 { 00:21:30.009 "dma_device_id": "system", 00:21:30.009 "dma_device_type": 1 00:21:30.009 }, 00:21:30.009 { 00:21:30.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.009 "dma_device_type": 2 00:21:30.009 } 00:21:30.009 ], 00:21:30.009 "driver_specific": { 00:21:30.009 "raid": { 00:21:30.009 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:30.009 "strip_size_kb": 0, 00:21:30.009 "state": "online", 00:21:30.009 "raid_level": "raid1", 00:21:30.009 "superblock": true, 00:21:30.009 "num_base_bdevs": 2, 00:21:30.009 "num_base_bdevs_discovered": 2, 00:21:30.009 "num_base_bdevs_operational": 2, 00:21:30.009 "base_bdevs_list": [ 00:21:30.009 { 00:21:30.009 "name": "pt1", 00:21:30.010 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:30.010 "is_configured": true, 00:21:30.010 "data_offset": 256, 00:21:30.010 "data_size": 7936 00:21:30.010 }, 00:21:30.010 { 00:21:30.010 "name": "pt2", 00:21:30.010 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:30.010 "is_configured": true, 00:21:30.010 "data_offset": 256, 00:21:30.010 "data_size": 7936 00:21:30.010 } 00:21:30.010 ] 00:21:30.010 } 00:21:30.010 } 00:21:30.010 }' 00:21:30.010 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:30.010 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:30.010 pt2' 00:21:30.010 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for 
name in $base_bdev_names 00:21:30.010 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:30.010 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:30.268 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:30.268 "name": "pt1", 00:21:30.268 "aliases": [ 00:21:30.268 "00000000-0000-0000-0000-000000000001" 00:21:30.268 ], 00:21:30.268 "product_name": "passthru", 00:21:30.268 "block_size": 4096, 00:21:30.268 "num_blocks": 8192, 00:21:30.268 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:30.268 "assigned_rate_limits": { 00:21:30.268 "rw_ios_per_sec": 0, 00:21:30.268 "rw_mbytes_per_sec": 0, 00:21:30.268 "r_mbytes_per_sec": 0, 00:21:30.269 "w_mbytes_per_sec": 0 00:21:30.269 }, 00:21:30.269 "claimed": true, 00:21:30.269 "claim_type": "exclusive_write", 00:21:30.269 "zoned": false, 00:21:30.269 "supported_io_types": { 00:21:30.269 "read": true, 00:21:30.269 "write": true, 00:21:30.269 "unmap": true, 00:21:30.269 "flush": true, 00:21:30.269 "reset": true, 00:21:30.269 "nvme_admin": false, 00:21:30.269 "nvme_io": false, 00:21:30.269 "nvme_io_md": false, 00:21:30.269 "write_zeroes": true, 00:21:30.269 "zcopy": true, 00:21:30.269 "get_zone_info": false, 00:21:30.269 "zone_management": false, 00:21:30.269 "zone_append": false, 00:21:30.269 "compare": false, 00:21:30.269 "compare_and_write": false, 00:21:30.269 "abort": true, 00:21:30.269 "seek_hole": false, 00:21:30.269 "seek_data": false, 00:21:30.269 "copy": true, 00:21:30.269 "nvme_iov_md": false 00:21:30.269 }, 00:21:30.269 "memory_domains": [ 00:21:30.269 { 00:21:30.269 "dma_device_id": "system", 00:21:30.269 "dma_device_type": 1 00:21:30.269 }, 00:21:30.269 { 00:21:30.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.269 "dma_device_type": 2 00:21:30.269 } 00:21:30.269 ], 00:21:30.269 
"driver_specific": { 00:21:30.269 "passthru": { 00:21:30.269 "name": "pt1", 00:21:30.269 "base_bdev_name": "malloc1" 00:21:30.269 } 00:21:30.269 } 00:21:30.269 }' 00:21:30.269 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:30.269 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:30.269 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:30.269 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:30.269 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:30.526 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:30.784 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:30.784 "name": "pt2", 00:21:30.784 "aliases": [ 00:21:30.784 
"00000000-0000-0000-0000-000000000002" 00:21:30.784 ], 00:21:30.784 "product_name": "passthru", 00:21:30.784 "block_size": 4096, 00:21:30.784 "num_blocks": 8192, 00:21:30.784 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:30.784 "assigned_rate_limits": { 00:21:30.784 "rw_ios_per_sec": 0, 00:21:30.784 "rw_mbytes_per_sec": 0, 00:21:30.784 "r_mbytes_per_sec": 0, 00:21:30.784 "w_mbytes_per_sec": 0 00:21:30.784 }, 00:21:30.784 "claimed": true, 00:21:30.784 "claim_type": "exclusive_write", 00:21:30.784 "zoned": false, 00:21:30.784 "supported_io_types": { 00:21:30.784 "read": true, 00:21:30.784 "write": true, 00:21:30.784 "unmap": true, 00:21:30.784 "flush": true, 00:21:30.784 "reset": true, 00:21:30.784 "nvme_admin": false, 00:21:30.784 "nvme_io": false, 00:21:30.784 "nvme_io_md": false, 00:21:30.784 "write_zeroes": true, 00:21:30.784 "zcopy": true, 00:21:30.784 "get_zone_info": false, 00:21:30.784 "zone_management": false, 00:21:30.784 "zone_append": false, 00:21:30.784 "compare": false, 00:21:30.784 "compare_and_write": false, 00:21:30.784 "abort": true, 00:21:30.784 "seek_hole": false, 00:21:30.784 "seek_data": false, 00:21:30.784 "copy": true, 00:21:30.784 "nvme_iov_md": false 00:21:30.784 }, 00:21:30.784 "memory_domains": [ 00:21:30.784 { 00:21:30.784 "dma_device_id": "system", 00:21:30.784 "dma_device_type": 1 00:21:30.784 }, 00:21:30.784 { 00:21:30.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.784 "dma_device_type": 2 00:21:30.784 } 00:21:30.784 ], 00:21:30.784 "driver_specific": { 00:21:30.784 "passthru": { 00:21:30.784 "name": "pt2", 00:21:30.784 "base_bdev_name": "malloc2" 00:21:30.784 } 00:21:30.784 } 00:21:30.784 }' 00:21:30.784 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:30.784 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:30.784 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:30.784 12:02:20 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:30.784 12:02:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:31.045 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:31.045 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:31.045 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:31.045 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:31.045 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:31.045 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:31.045 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:31.045 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:31.045 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:31.304 [2024-07-12 12:02:21.346038] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' e765c662-f7af-4b3f-8a44-e3c5db911f14 '!=' e765c662-f7af-4b3f-8a44-e3c5db911f14 ']' 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 
00:21:31.304 [2024-07-12 12:02:21.514331] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.304 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:31.563 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:31.564 "name": "raid_bdev1", 00:21:31.564 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:31.564 "strip_size_kb": 0, 00:21:31.564 "state": "online", 00:21:31.564 "raid_level": "raid1", 00:21:31.564 "superblock": true, 00:21:31.564 "num_base_bdevs": 2, 00:21:31.564 "num_base_bdevs_discovered": 1, 00:21:31.564 
"num_base_bdevs_operational": 1, 00:21:31.564 "base_bdevs_list": [ 00:21:31.564 { 00:21:31.564 "name": null, 00:21:31.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:31.564 "is_configured": false, 00:21:31.564 "data_offset": 256, 00:21:31.564 "data_size": 7936 00:21:31.564 }, 00:21:31.564 { 00:21:31.564 "name": "pt2", 00:21:31.564 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:31.564 "is_configured": true, 00:21:31.564 "data_offset": 256, 00:21:31.564 "data_size": 7936 00:21:31.564 } 00:21:31.564 ] 00:21:31.564 }' 00:21:31.564 12:02:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:31.564 12:02:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:32.130 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:32.130 [2024-07-12 12:02:22.332431] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:32.130 [2024-07-12 12:02:22.332450] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:32.130 [2024-07-12 12:02:22.332491] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:32.130 [2024-07-12 12:02:22.332527] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:32.130 [2024-07-12 12:02:22.332533] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b653b0 name raid_bdev1, state offline 00:21:32.130 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.130 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:21:32.389 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # 
raid_bdev= 00:21:32.389 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:21:32.389 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:21:32.389 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:32.389 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:32.648 [2024-07-12 12:02:22.813668] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:32.648 [2024-07-12 12:02:22.813698] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:32.648 [2024-07-12 12:02:22.813709] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b666d0 00:21:32.648 [2024-07-12 12:02:22.813730] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:32.648 [2024-07-12 12:02:22.814885] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:32.648 [2024-07-12 12:02:22.814904] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev 
for: pt2 00:21:32.648 [2024-07-12 12:02:22.814947] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:32.648 [2024-07-12 12:02:22.814967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:32.648 [2024-07-12 12:02:22.815029] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d10280 00:21:32.648 [2024-07-12 12:02:22.815034] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:32.648 [2024-07-12 12:02:22.815146] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b65e70 00:21:32.648 [2024-07-12 12:02:22.815228] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d10280 00:21:32.648 [2024-07-12 12:02:22.815233] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d10280 00:21:32.648 [2024-07-12 12:02:22.815297] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:32.648 pt2 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.648 12:02:22 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.648 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.909 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.909 "name": "raid_bdev1", 00:21:32.909 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:32.909 "strip_size_kb": 0, 00:21:32.909 "state": "online", 00:21:32.909 "raid_level": "raid1", 00:21:32.909 "superblock": true, 00:21:32.909 "num_base_bdevs": 2, 00:21:32.909 "num_base_bdevs_discovered": 1, 00:21:32.909 "num_base_bdevs_operational": 1, 00:21:32.909 "base_bdevs_list": [ 00:21:32.909 { 00:21:32.909 "name": null, 00:21:32.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.909 "is_configured": false, 00:21:32.909 "data_offset": 256, 00:21:32.909 "data_size": 7936 00:21:32.909 }, 00:21:32.909 { 00:21:32.909 "name": "pt2", 00:21:32.909 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:32.909 "is_configured": true, 00:21:32.909 "data_offset": 256, 00:21:32.909 "data_size": 7936 00:21:32.909 } 00:21:32.909 ] 00:21:32.909 }' 00:21:32.909 12:02:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.909 12:02:22 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:33.477 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:33.477 [2024-07-12 12:02:23.631788] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
raid_bdev1 00:21:33.477 [2024-07-12 12:02:23.631805] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:33.477 [2024-07-12 12:02:23.631842] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:33.477 [2024-07-12 12:02:23.631873] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:33.477 [2024-07-12 12:02:23.631880] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d10280 name raid_bdev1, state offline 00:21:33.477 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.477 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:21:33.735 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:21:33.735 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:21:33.735 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:21:33.735 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:33.735 [2024-07-12 12:02:23.972659] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:33.735 [2024-07-12 12:02:23.972691] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:33.735 [2024-07-12 12:02:23.972701] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d11040 00:21:33.735 [2024-07-12 12:02:23.972722] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:33.735 [2024-07-12 12:02:23.973874] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:21:33.735 [2024-07-12 12:02:23.973892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:33.735 [2024-07-12 12:02:23.973934] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:33.735 [2024-07-12 12:02:23.973954] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:33.735 [2024-07-12 12:02:23.974021] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:21:33.735 [2024-07-12 12:02:23.974032] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:33.735 [2024-07-12 12:02:23.974040] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d16560 name raid_bdev1, state configuring 00:21:33.735 [2024-07-12 12:02:23.974054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:33.735 [2024-07-12 12:02:23.974094] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d14150 00:21:33.735 [2024-07-12 12:02:23.974100] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:33.735 [2024-07-12 12:02:23.974208] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d12fc0 00:21:33.735 [2024-07-12 12:02:23.974289] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d14150 00:21:33.735 [2024-07-12 12:02:23.974294] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d14150 00:21:33.735 [2024-07-12 12:02:23.974358] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:33.735 pt1 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:33.993 12:02:23 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.993 12:02:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:33.993 12:02:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.993 "name": "raid_bdev1", 00:21:33.993 "uuid": "e765c662-f7af-4b3f-8a44-e3c5db911f14", 00:21:33.993 "strip_size_kb": 0, 00:21:33.993 "state": "online", 00:21:33.993 "raid_level": "raid1", 00:21:33.993 "superblock": true, 00:21:33.993 "num_base_bdevs": 2, 00:21:33.993 "num_base_bdevs_discovered": 1, 00:21:33.993 "num_base_bdevs_operational": 1, 00:21:33.993 "base_bdevs_list": [ 00:21:33.993 { 00:21:33.993 "name": null, 00:21:33.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.993 "is_configured": false, 00:21:33.993 "data_offset": 256, 00:21:33.993 "data_size": 7936 
00:21:33.993 }, 00:21:33.993 { 00:21:33.993 "name": "pt2", 00:21:33.993 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:33.993 "is_configured": true, 00:21:33.993 "data_offset": 256, 00:21:33.993 "data_size": 7936 00:21:33.993 } 00:21:33.993 ] 00:21:33.993 }' 00:21:33.993 12:02:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.993 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:34.561 12:02:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:34.561 12:02:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:34.561 12:02:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:21:34.561 12:02:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:34.561 12:02:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:21:34.820 [2024-07-12 12:02:24.927280] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' e765c662-f7af-4b3f-8a44-e3c5db911f14 '!=' e765c662-f7af-4b3f-8a44-e3c5db911f14 ']' 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 726725 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 726725 ']' 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 726725 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k 
-- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 726725 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 726725' 00:21:34.820 killing process with pid 726725 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 726725 00:21:34.820 [2024-07-12 12:02:24.986327] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:34.820 [2024-07-12 12:02:24.986367] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:34.820 [2024-07-12 12:02:24.986395] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:34.820 [2024-07-12 12:02:24.986401] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d14150 name raid_bdev1, state offline 00:21:34.820 12:02:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 726725 00:21:34.820 [2024-07-12 12:02:25.001766] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:35.080 12:02:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:21:35.080 00:21:35.080 real 0m11.455s 00:21:35.080 user 0m21.011s 00:21:35.080 sys 0m1.761s 00:21:35.080 12:02:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:35.080 12:02:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:35.080 ************************************ 00:21:35.080 END TEST raid_superblock_test_4k 00:21:35.080 ************************************ 00:21:35.080 
12:02:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:35.080 12:02:25 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:21:35.080 12:02:25 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:21:35.080 12:02:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:35.080 12:02:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:35.080 12:02:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:35.080 ************************************ 00:21:35.080 START TEST raid_rebuild_test_sb_4k 00:21:35.080 ************************************ 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:35.080 
12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=728992 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 728992 /var/tmp/spdk-raid.sock 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 728992 ']' 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:35.080 12:02:25 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:35.080 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:35.080 12:02:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:35.080 [2024-07-12 12:02:25.288052] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:21:35.080 [2024-07-12 12:02:25.288088] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid728992 ] 00:21:35.080 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:35.080 Zero copy mechanism will not be used. 
00:21:35.338 [2024-07-12 12:02:25.351440] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:35.338 [2024-07-12 12:02:25.428682] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:35.338 [2024-07-12 12:02:25.479230] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:35.338 [2024-07-12 12:02:25.479264] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:35.905 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:35.905 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:21:35.905 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:35.905 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:21:36.163 BaseBdev1_malloc 00:21:36.163 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:36.163 [2024-07-12 12:02:26.378323] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:36.163 [2024-07-12 12:02:26.378352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:36.163 [2024-07-12 12:02:26.378363] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa63010 00:21:36.163 [2024-07-12 12:02:26.378369] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:36.163 [2024-07-12 12:02:26.379480] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:36.163 [2024-07-12 12:02:26.379499] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:36.163 
BaseBdev1 00:21:36.163 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:36.163 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:21:36.421 BaseBdev2_malloc 00:21:36.421 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:36.679 [2024-07-12 12:02:26.702716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:36.679 [2024-07-12 12:02:26.702745] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:36.679 [2024-07-12 12:02:26.702758] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa63b60 00:21:36.679 [2024-07-12 12:02:26.702764] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:36.679 [2024-07-12 12:02:26.703833] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:36.679 [2024-07-12 12:02:26.703853] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:36.679 BaseBdev2 00:21:36.679 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:21:36.679 spare_malloc 00:21:36.679 12:02:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:36.947 spare_delay 00:21:36.947 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:36.947 [2024-07-12 12:02:27.179470] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:36.947 [2024-07-12 12:02:27.179497] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:36.947 [2024-07-12 12:02:27.179507] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc0e880 00:21:36.947 [2024-07-12 12:02:27.179513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:36.947 [2024-07-12 12:02:27.180523] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:36.947 [2024-07-12 12:02:27.180542] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:36.947 spare 00:21:36.947 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:37.206 [2024-07-12 12:02:27.343954] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:37.206 [2024-07-12 12:02:27.344865] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:37.206 [2024-07-12 12:02:27.344972] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc126c0 00:21:37.206 [2024-07-12 12:02:27.344980] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:37.206 [2024-07-12 12:02:27.345106] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc0eb10 00:21:37.206 [2024-07-12 12:02:27.345198] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc126c0 00:21:37.206 [2024-07-12 12:02:27.345207] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0xc126c0 00:21:37.206 [2024-07-12 12:02:27.345268] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.206 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.465 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.465 "name": "raid_bdev1", 00:21:37.465 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:37.465 "strip_size_kb": 0, 00:21:37.465 "state": "online", 00:21:37.465 "raid_level": "raid1", 00:21:37.465 "superblock": true, 00:21:37.465 "num_base_bdevs": 2, 00:21:37.465 
"num_base_bdevs_discovered": 2, 00:21:37.465 "num_base_bdevs_operational": 2, 00:21:37.465 "base_bdevs_list": [ 00:21:37.465 { 00:21:37.465 "name": "BaseBdev1", 00:21:37.465 "uuid": "4286efe1-cdc9-5c58-a165-db5ea1073e28", 00:21:37.465 "is_configured": true, 00:21:37.465 "data_offset": 256, 00:21:37.465 "data_size": 7936 00:21:37.465 }, 00:21:37.465 { 00:21:37.465 "name": "BaseBdev2", 00:21:37.465 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:37.465 "is_configured": true, 00:21:37.465 "data_offset": 256, 00:21:37.465 "data_size": 7936 00:21:37.465 } 00:21:37.465 ] 00:21:37.465 }' 00:21:37.465 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.465 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:38.032 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:38.032 12:02:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:38.032 [2024-07-12 12:02:28.146155] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:38.032 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:21:38.032 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.032 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:38.291 
12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:38.291 [2024-07-12 12:02:28.482896] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc14af0 00:21:38.291 /dev/nbd0 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:38.291 12:02:28 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:38.291 1+0 records in 00:21:38.291 1+0 records out 00:21:38.291 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000207944 s, 19.7 MB/s 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:38.291 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:21:38.857 7936+0 records in 00:21:38.857 7936+0 records out 00:21:38.857 32505856 bytes (33 MB, 31 MiB) copied, 0.458325 s, 70.9 MB/s 00:21:38.857 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:38.857 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:38.857 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:38.857 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:38.857 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:21:38.857 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:38.857 12:02:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:39.116 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:39.116 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:39.116 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:39.116 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:39.116 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:39.116 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:39.116 [2024-07-12 12:02:29.190297] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:39.116 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:21:39.116 12:02:29 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:21:39.116 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:39.116 [2024-07-12 12:02:29.348252] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.374 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.374 "name": "raid_bdev1", 00:21:39.374 "uuid": 
"199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:39.374 "strip_size_kb": 0, 00:21:39.374 "state": "online", 00:21:39.374 "raid_level": "raid1", 00:21:39.374 "superblock": true, 00:21:39.374 "num_base_bdevs": 2, 00:21:39.374 "num_base_bdevs_discovered": 1, 00:21:39.374 "num_base_bdevs_operational": 1, 00:21:39.374 "base_bdevs_list": [ 00:21:39.374 { 00:21:39.374 "name": null, 00:21:39.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.374 "is_configured": false, 00:21:39.374 "data_offset": 256, 00:21:39.374 "data_size": 7936 00:21:39.374 }, 00:21:39.375 { 00:21:39.375 "name": "BaseBdev2", 00:21:39.375 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:39.375 "is_configured": true, 00:21:39.375 "data_offset": 256, 00:21:39.375 "data_size": 7936 00:21:39.375 } 00:21:39.375 ] 00:21:39.375 }' 00:21:39.375 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.375 12:02:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:39.941 12:02:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:39.941 [2024-07-12 12:02:30.182437] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:39.941 [2024-07-12 12:02:30.186732] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc14af0 00:21:39.941 [2024-07-12 12:02:30.188173] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:40.200 12:02:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:41.196 12:02:31 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:41.196 "name": "raid_bdev1", 00:21:41.196 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:41.196 "strip_size_kb": 0, 00:21:41.196 "state": "online", 00:21:41.196 "raid_level": "raid1", 00:21:41.196 "superblock": true, 00:21:41.196 "num_base_bdevs": 2, 00:21:41.196 "num_base_bdevs_discovered": 2, 00:21:41.196 "num_base_bdevs_operational": 2, 00:21:41.196 "process": { 00:21:41.196 "type": "rebuild", 00:21:41.196 "target": "spare", 00:21:41.196 "progress": { 00:21:41.196 "blocks": 2816, 00:21:41.196 "percent": 35 00:21:41.196 } 00:21:41.196 }, 00:21:41.196 "base_bdevs_list": [ 00:21:41.196 { 00:21:41.196 "name": "spare", 00:21:41.196 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:41.196 "is_configured": true, 00:21:41.196 "data_offset": 256, 00:21:41.196 "data_size": 7936 00:21:41.196 }, 00:21:41.196 { 00:21:41.196 "name": "BaseBdev2", 00:21:41.196 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:41.196 "is_configured": true, 00:21:41.196 "data_offset": 256, 00:21:41.196 "data_size": 7936 00:21:41.196 } 00:21:41.196 ] 00:21:41.196 }' 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:41.196 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:41.455 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:41.455 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:41.455 [2024-07-12 12:02:31.611216] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:41.455 [2024-07-12 12:02:31.698748] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:41.455 [2024-07-12 12:02:31.698781] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:41.455 [2024-07-12 12:02:31.698790] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:41.455 [2024-07-12 12:02:31.698794] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:41.713 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:41.713 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.714 "name": "raid_bdev1", 00:21:41.714 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:41.714 "strip_size_kb": 0, 00:21:41.714 "state": "online", 00:21:41.714 "raid_level": "raid1", 00:21:41.714 "superblock": true, 00:21:41.714 "num_base_bdevs": 2, 00:21:41.714 "num_base_bdevs_discovered": 1, 00:21:41.714 "num_base_bdevs_operational": 1, 00:21:41.714 "base_bdevs_list": [ 00:21:41.714 { 00:21:41.714 "name": null, 00:21:41.714 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.714 "is_configured": false, 00:21:41.714 "data_offset": 256, 00:21:41.714 "data_size": 7936 00:21:41.714 }, 00:21:41.714 { 00:21:41.714 "name": "BaseBdev2", 00:21:41.714 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:41.714 "is_configured": true, 00:21:41.714 "data_offset": 256, 00:21:41.714 "data_size": 7936 00:21:41.714 } 00:21:41.714 ] 00:21:41.714 }' 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.714 12:02:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:42.280 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:42.280 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:21:42.280 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:42.280 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:42.280 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:42.280 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.280 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.539 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:42.539 "name": "raid_bdev1", 00:21:42.539 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:42.539 "strip_size_kb": 0, 00:21:42.539 "state": "online", 00:21:42.539 "raid_level": "raid1", 00:21:42.539 "superblock": true, 00:21:42.539 "num_base_bdevs": 2, 00:21:42.539 "num_base_bdevs_discovered": 1, 00:21:42.539 "num_base_bdevs_operational": 1, 00:21:42.539 "base_bdevs_list": [ 00:21:42.539 { 00:21:42.539 "name": null, 00:21:42.539 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.539 "is_configured": false, 00:21:42.539 "data_offset": 256, 00:21:42.539 "data_size": 7936 00:21:42.539 }, 00:21:42.539 { 00:21:42.539 "name": "BaseBdev2", 00:21:42.539 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:42.539 "is_configured": true, 00:21:42.539 "data_offset": 256, 00:21:42.539 "data_size": 7936 00:21:42.539 } 00:21:42.539 ] 00:21:42.539 }' 00:21:42.539 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:42.539 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:42.539 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:21:42.539 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:42.539 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:42.798 [2024-07-12 12:02:32.793687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:42.798 [2024-07-12 12:02:32.797991] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc0eb10 00:21:42.798 [2024-07-12 12:02:32.799048] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:42.798 12:02:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:43.732 12:02:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:43.732 12:02:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:43.732 12:02:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:43.732 12:02:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:43.732 12:02:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:43.732 12:02:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.732 12:02:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:43.991 12:02:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:43.991 "name": "raid_bdev1", 00:21:43.991 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:43.991 "strip_size_kb": 0, 00:21:43.991 "state": "online", 00:21:43.991 
"raid_level": "raid1", 00:21:43.991 "superblock": true, 00:21:43.991 "num_base_bdevs": 2, 00:21:43.991 "num_base_bdevs_discovered": 2, 00:21:43.991 "num_base_bdevs_operational": 2, 00:21:43.991 "process": { 00:21:43.991 "type": "rebuild", 00:21:43.991 "target": "spare", 00:21:43.991 "progress": { 00:21:43.992 "blocks": 2816, 00:21:43.992 "percent": 35 00:21:43.992 } 00:21:43.992 }, 00:21:43.992 "base_bdevs_list": [ 00:21:43.992 { 00:21:43.992 "name": "spare", 00:21:43.992 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:43.992 "is_configured": true, 00:21:43.992 "data_offset": 256, 00:21:43.992 "data_size": 7936 00:21:43.992 }, 00:21:43.992 { 00:21:43.992 "name": "BaseBdev2", 00:21:43.992 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:43.992 "is_configured": true, 00:21:43.992 "data_offset": 256, 00:21:43.992 "data_size": 7936 00:21:43.992 } 00:21:43.992 ] 00:21:43.992 }' 00:21:43.992 12:02:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:21:43.992 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=774 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:43.992 "name": "raid_bdev1", 00:21:43.992 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:43.992 "strip_size_kb": 0, 00:21:43.992 "state": "online", 00:21:43.992 "raid_level": "raid1", 00:21:43.992 "superblock": true, 00:21:43.992 "num_base_bdevs": 2, 00:21:43.992 "num_base_bdevs_discovered": 2, 00:21:43.992 "num_base_bdevs_operational": 2, 00:21:43.992 "process": { 00:21:43.992 "type": "rebuild", 00:21:43.992 "target": "spare", 00:21:43.992 "progress": { 00:21:43.992 "blocks": 3584, 00:21:43.992 "percent": 45 00:21:43.992 } 00:21:43.992 }, 00:21:43.992 "base_bdevs_list": [ 00:21:43.992 { 00:21:43.992 "name": "spare", 00:21:43.992 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:43.992 "is_configured": 
true, 00:21:43.992 "data_offset": 256, 00:21:43.992 "data_size": 7936 00:21:43.992 }, 00:21:43.992 { 00:21:43.992 "name": "BaseBdev2", 00:21:43.992 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:43.992 "is_configured": true, 00:21:43.992 "data_offset": 256, 00:21:43.992 "data_size": 7936 00:21:43.992 } 00:21:43.992 ] 00:21:43.992 }' 00:21:43.992 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:44.251 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:44.251 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:44.251 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:44.251 12:02:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:45.189 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:45.189 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:45.189 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:45.189 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:45.189 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:45.189 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:45.189 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.189 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.447 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:45.447 "name": "raid_bdev1", 00:21:45.447 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:45.447 "strip_size_kb": 0, 00:21:45.447 "state": "online", 00:21:45.447 "raid_level": "raid1", 00:21:45.447 "superblock": true, 00:21:45.447 "num_base_bdevs": 2, 00:21:45.447 "num_base_bdevs_discovered": 2, 00:21:45.447 "num_base_bdevs_operational": 2, 00:21:45.447 "process": { 00:21:45.447 "type": "rebuild", 00:21:45.447 "target": "spare", 00:21:45.447 "progress": { 00:21:45.447 "blocks": 6656, 00:21:45.447 "percent": 83 00:21:45.447 } 00:21:45.447 }, 00:21:45.448 "base_bdevs_list": [ 00:21:45.448 { 00:21:45.448 "name": "spare", 00:21:45.448 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:45.448 "is_configured": true, 00:21:45.448 "data_offset": 256, 00:21:45.448 "data_size": 7936 00:21:45.448 }, 00:21:45.448 { 00:21:45.448 "name": "BaseBdev2", 00:21:45.448 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:45.448 "is_configured": true, 00:21:45.448 "data_offset": 256, 00:21:45.448 "data_size": 7936 00:21:45.448 } 00:21:45.448 ] 00:21:45.448 }' 00:21:45.448 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:45.448 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:45.448 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:45.448 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:45.448 12:02:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:45.708 [2024-07-12 12:02:35.920429] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:45.709 [2024-07-12 12:02:35.920468] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:45.709 [2024-07-12 12:02:35.920527] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:46.645 "name": "raid_bdev1", 00:21:46.645 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:46.645 "strip_size_kb": 0, 00:21:46.645 "state": "online", 00:21:46.645 "raid_level": "raid1", 00:21:46.645 "superblock": true, 00:21:46.645 "num_base_bdevs": 2, 00:21:46.645 "num_base_bdevs_discovered": 2, 00:21:46.645 "num_base_bdevs_operational": 2, 00:21:46.645 "base_bdevs_list": [ 00:21:46.645 { 00:21:46.645 "name": "spare", 00:21:46.645 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:46.645 "is_configured": true, 00:21:46.645 "data_offset": 256, 00:21:46.645 "data_size": 7936 00:21:46.645 }, 00:21:46.645 { 00:21:46.645 "name": "BaseBdev2", 00:21:46.645 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:46.645 "is_configured": true, 00:21:46.645 "data_offset": 256, 00:21:46.645 
"data_size": 7936 00:21:46.645 } 00:21:46.645 ] 00:21:46.645 }' 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.645 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:46.912 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:46.912 "name": "raid_bdev1", 00:21:46.912 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:46.912 "strip_size_kb": 0, 00:21:46.912 "state": "online", 00:21:46.912 "raid_level": "raid1", 00:21:46.912 "superblock": true, 00:21:46.912 "num_base_bdevs": 2, 00:21:46.912 "num_base_bdevs_discovered": 2, 00:21:46.912 "num_base_bdevs_operational": 2, 00:21:46.912 
"base_bdevs_list": [ 00:21:46.912 { 00:21:46.912 "name": "spare", 00:21:46.912 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:46.912 "is_configured": true, 00:21:46.912 "data_offset": 256, 00:21:46.912 "data_size": 7936 00:21:46.912 }, 00:21:46.912 { 00:21:46.912 "name": "BaseBdev2", 00:21:46.912 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:46.912 "is_configured": true, 00:21:46.912 "data_offset": 256, 00:21:46.912 "data_size": 7936 00:21:46.912 } 00:21:46.912 ] 00:21:46.912 }' 00:21:46.912 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:46.912 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:46.912 12:02:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.912 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.173 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.173 "name": "raid_bdev1", 00:21:47.173 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:47.173 "strip_size_kb": 0, 00:21:47.173 "state": "online", 00:21:47.173 "raid_level": "raid1", 00:21:47.173 "superblock": true, 00:21:47.173 "num_base_bdevs": 2, 00:21:47.173 "num_base_bdevs_discovered": 2, 00:21:47.173 "num_base_bdevs_operational": 2, 00:21:47.173 "base_bdevs_list": [ 00:21:47.173 { 00:21:47.173 "name": "spare", 00:21:47.173 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:47.173 "is_configured": true, 00:21:47.173 "data_offset": 256, 00:21:47.173 "data_size": 7936 00:21:47.173 }, 00:21:47.173 { 00:21:47.173 "name": "BaseBdev2", 00:21:47.173 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:47.173 "is_configured": true, 00:21:47.173 "data_offset": 256, 00:21:47.173 "data_size": 7936 00:21:47.173 } 00:21:47.174 ] 00:21:47.174 }' 00:21:47.174 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.174 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:47.739 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:47.740 [2024-07-12 12:02:37.833060] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:47.740 [2024-07-12 12:02:37.833080] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:21:47.740 [2024-07-12 12:02:37.833121] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:47.740 [2024-07-12 12:02:37.833169] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:47.740 [2024-07-12 12:02:37.833175] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc126c0 name raid_bdev1, state offline 00:21:47.740 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:21:47.740 12:02:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 
0 )) 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:47.998 /dev/nbd0 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:47.998 1+0 records in 00:21:47.998 1+0 records out 00:21:47.998 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000222134 s, 18.4 MB/s 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:47.998 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:48.258 /dev/nbd1 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:48.258 1+0 records in 00:21:48.258 1+0 records out 00:21:48.258 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223359 s, 18.3 MB/s 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:21:48.258 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:48.258 12:02:38 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:48.517 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:48.775 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:48.775 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:48.775 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:48.775 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:48.775 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:48.775 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:48.775 12:02:38 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:21:48.775 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:21:48.775 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:21:48.775 12:02:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:49.033 [2024-07-12 12:02:39.189802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:49.033 [2024-07-12 12:02:39.189832] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:49.033 [2024-07-12 12:02:39.189843] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc148f0 00:21:49.033 [2024-07-12 12:02:39.189848] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:49.033 [2024-07-12 12:02:39.191011] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:49.033 [2024-07-12 12:02:39.191030] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:49.033 [2024-07-12 12:02:39.191077] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:49.033 [2024-07-12 12:02:39.191095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:49.033 [2024-07-12 12:02:39.191164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:49.033 spare 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:49.033 12:02:39 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.033 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.292 [2024-07-12 12:02:39.291456] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa61bb0 00:21:49.292 [2024-07-12 12:02:39.291466] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:49.292 [2024-07-12 12:02:39.291589] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc0dc90 00:21:49.292 [2024-07-12 12:02:39.291687] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa61bb0 00:21:49.292 [2024-07-12 12:02:39.291693] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa61bb0 00:21:49.292 [2024-07-12 12:02:39.291754] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:49.292 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.292 "name": "raid_bdev1", 00:21:49.292 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:49.292 "strip_size_kb": 0, 00:21:49.292 "state": "online", 00:21:49.292 "raid_level": "raid1", 00:21:49.292 "superblock": true, 00:21:49.292 "num_base_bdevs": 2, 00:21:49.292 "num_base_bdevs_discovered": 2, 00:21:49.292 "num_base_bdevs_operational": 2, 00:21:49.292 "base_bdevs_list": [ 00:21:49.292 { 00:21:49.292 "name": "spare", 00:21:49.292 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:49.292 "is_configured": true, 00:21:49.292 "data_offset": 256, 00:21:49.292 "data_size": 7936 00:21:49.292 }, 00:21:49.292 { 00:21:49.292 "name": "BaseBdev2", 00:21:49.292 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:49.292 "is_configured": true, 00:21:49.292 "data_offset": 256, 00:21:49.292 "data_size": 7936 00:21:49.292 } 00:21:49.292 ] 00:21:49.292 }' 00:21:49.292 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.292 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:49.860 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:49.860 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:49.860 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:49.860 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:49.860 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:49.860 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:21:49.860 12:02:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.860 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:49.860 "name": "raid_bdev1", 00:21:49.860 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:49.860 "strip_size_kb": 0, 00:21:49.860 "state": "online", 00:21:49.860 "raid_level": "raid1", 00:21:49.860 "superblock": true, 00:21:49.860 "num_base_bdevs": 2, 00:21:49.860 "num_base_bdevs_discovered": 2, 00:21:49.860 "num_base_bdevs_operational": 2, 00:21:49.860 "base_bdevs_list": [ 00:21:49.860 { 00:21:49.860 "name": "spare", 00:21:49.860 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:49.860 "is_configured": true, 00:21:49.860 "data_offset": 256, 00:21:49.860 "data_size": 7936 00:21:49.860 }, 00:21:49.860 { 00:21:49.860 "name": "BaseBdev2", 00:21:49.860 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:49.860 "is_configured": true, 00:21:49.860 "data_offset": 256, 00:21:49.860 "data_size": 7936 00:21:49.860 } 00:21:49.860 ] 00:21:49.860 }' 00:21:49.860 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:49.860 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:50.120 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:50.120 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:50.120 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.120 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:50.120 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 
00:21:50.120 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:50.379 [2024-07-12 12:02:40.493224] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.379 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.639 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.639 "name": "raid_bdev1", 00:21:50.639 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:50.639 
"strip_size_kb": 0, 00:21:50.639 "state": "online", 00:21:50.639 "raid_level": "raid1", 00:21:50.639 "superblock": true, 00:21:50.639 "num_base_bdevs": 2, 00:21:50.639 "num_base_bdevs_discovered": 1, 00:21:50.639 "num_base_bdevs_operational": 1, 00:21:50.639 "base_bdevs_list": [ 00:21:50.639 { 00:21:50.639 "name": null, 00:21:50.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.639 "is_configured": false, 00:21:50.639 "data_offset": 256, 00:21:50.639 "data_size": 7936 00:21:50.639 }, 00:21:50.639 { 00:21:50.639 "name": "BaseBdev2", 00:21:50.639 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:50.639 "is_configured": true, 00:21:50.639 "data_offset": 256, 00:21:50.639 "data_size": 7936 00:21:50.639 } 00:21:50.639 ] 00:21:50.639 }' 00:21:50.639 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.639 12:02:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:51.206 12:02:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:51.206 [2024-07-12 12:02:41.303326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:51.206 [2024-07-12 12:02:41.303434] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:51.206 [2024-07-12 12:02:41.303444] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:21:51.206 [2024-07-12 12:02:41.303462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:51.206 [2024-07-12 12:02:41.307678] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x76a230 00:21:51.206 [2024-07-12 12:02:41.309226] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:51.206 12:02:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:21:52.142 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:52.142 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:52.142 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:52.142 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:52.142 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:52.142 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.142 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.401 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:52.401 "name": "raid_bdev1", 00:21:52.401 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:52.401 "strip_size_kb": 0, 00:21:52.401 "state": "online", 00:21:52.401 "raid_level": "raid1", 00:21:52.401 "superblock": true, 00:21:52.401 "num_base_bdevs": 2, 00:21:52.401 "num_base_bdevs_discovered": 2, 00:21:52.401 "num_base_bdevs_operational": 2, 00:21:52.401 "process": { 00:21:52.401 "type": "rebuild", 00:21:52.401 "target": "spare", 00:21:52.401 "progress": { 00:21:52.401 "blocks": 2816, 
00:21:52.401 "percent": 35 00:21:52.401 } 00:21:52.401 }, 00:21:52.401 "base_bdevs_list": [ 00:21:52.401 { 00:21:52.401 "name": "spare", 00:21:52.401 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:52.401 "is_configured": true, 00:21:52.401 "data_offset": 256, 00:21:52.401 "data_size": 7936 00:21:52.401 }, 00:21:52.401 { 00:21:52.401 "name": "BaseBdev2", 00:21:52.401 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:52.401 "is_configured": true, 00:21:52.401 "data_offset": 256, 00:21:52.401 "data_size": 7936 00:21:52.401 } 00:21:52.401 ] 00:21:52.401 }' 00:21:52.401 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:52.401 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:52.401 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:52.401 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:52.401 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:52.660 [2024-07-12 12:02:42.735832] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:52.660 [2024-07-12 12:02:42.819705] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:52.660 [2024-07-12 12:02:42.819731] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:52.660 [2024-07-12 12:02:42.819740] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:52.660 [2024-07-12 12:02:42.819759] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.660 12:02:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.919 12:02:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.919 "name": "raid_bdev1", 00:21:52.919 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:52.919 "strip_size_kb": 0, 00:21:52.919 "state": "online", 00:21:52.919 "raid_level": "raid1", 00:21:52.919 "superblock": true, 00:21:52.919 "num_base_bdevs": 2, 00:21:52.919 "num_base_bdevs_discovered": 1, 00:21:52.919 "num_base_bdevs_operational": 1, 00:21:52.919 "base_bdevs_list": [ 00:21:52.919 { 00:21:52.919 "name": null, 00:21:52.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.919 "is_configured": false, 00:21:52.919 "data_offset": 
256, 00:21:52.919 "data_size": 7936 00:21:52.919 }, 00:21:52.919 { 00:21:52.919 "name": "BaseBdev2", 00:21:52.919 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:52.919 "is_configured": true, 00:21:52.919 "data_offset": 256, 00:21:52.919 "data_size": 7936 00:21:52.919 } 00:21:52.919 ] 00:21:52.919 }' 00:21:52.919 12:02:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.919 12:02:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:53.483 12:02:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:53.483 [2024-07-12 12:02:43.637861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:53.483 [2024-07-12 12:02:43.637895] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:53.483 [2024-07-12 12:02:43.637906] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa59890 00:21:53.483 [2024-07-12 12:02:43.637912] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.483 [2024-07-12 12:02:43.638167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.483 [2024-07-12 12:02:43.638176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:53.483 [2024-07-12 12:02:43.638231] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:53.483 [2024-07-12 12:02:43.638238] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:53.483 [2024-07-12 12:02:43.638244] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:21:53.483 [2024-07-12 12:02:43.638254] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:53.483 [2024-07-12 12:02:43.642377] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x76a230 00:21:53.483 spare 00:21:53.483 [2024-07-12 12:02:43.643445] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:53.483 12:02:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:21:54.418 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:54.418 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:54.418 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:54.418 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:54.418 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:54.418 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.418 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:54.677 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:54.677 "name": "raid_bdev1", 00:21:54.677 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:54.677 "strip_size_kb": 0, 00:21:54.677 "state": "online", 00:21:54.677 "raid_level": "raid1", 00:21:54.677 "superblock": true, 00:21:54.677 "num_base_bdevs": 2, 00:21:54.677 "num_base_bdevs_discovered": 2, 00:21:54.677 "num_base_bdevs_operational": 2, 00:21:54.677 "process": { 00:21:54.677 "type": "rebuild", 00:21:54.677 "target": "spare", 00:21:54.677 "progress": { 00:21:54.677 
"blocks": 2816, 00:21:54.677 "percent": 35 00:21:54.677 } 00:21:54.677 }, 00:21:54.677 "base_bdevs_list": [ 00:21:54.677 { 00:21:54.677 "name": "spare", 00:21:54.677 "uuid": "40be32ab-67e8-525b-abea-205bc17a32bc", 00:21:54.677 "is_configured": true, 00:21:54.677 "data_offset": 256, 00:21:54.677 "data_size": 7936 00:21:54.677 }, 00:21:54.677 { 00:21:54.677 "name": "BaseBdev2", 00:21:54.677 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:54.677 "is_configured": true, 00:21:54.677 "data_offset": 256, 00:21:54.677 "data_size": 7936 00:21:54.677 } 00:21:54.677 ] 00:21:54.677 }' 00:21:54.677 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:54.677 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:54.677 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:54.677 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:54.677 12:02:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:54.936 [2024-07-12 12:02:45.070452] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:54.936 [2024-07-12 12:02:45.153957] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:54.936 [2024-07-12 12:02:45.153988] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:54.936 [2024-07-12 12:02:45.153997] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:54.936 [2024-07-12 12:02:45.154002] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.936 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.193 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:55.193 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.193 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.193 "name": "raid_bdev1", 00:21:55.193 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:55.193 "strip_size_kb": 0, 00:21:55.193 "state": "online", 00:21:55.193 "raid_level": "raid1", 00:21:55.193 "superblock": true, 00:21:55.193 "num_base_bdevs": 2, 00:21:55.193 "num_base_bdevs_discovered": 1, 00:21:55.193 "num_base_bdevs_operational": 1, 00:21:55.193 "base_bdevs_list": [ 00:21:55.193 { 00:21:55.193 "name": null, 00:21:55.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.193 "is_configured": false, 00:21:55.193 
"data_offset": 256, 00:21:55.193 "data_size": 7936 00:21:55.193 }, 00:21:55.193 { 00:21:55.193 "name": "BaseBdev2", 00:21:55.193 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:55.193 "is_configured": true, 00:21:55.193 "data_offset": 256, 00:21:55.193 "data_size": 7936 00:21:55.193 } 00:21:55.193 ] 00:21:55.194 }' 00:21:55.194 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.194 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:55.760 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:55.760 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:55.760 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:55.760 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:55.760 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:55.760 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.760 12:02:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:56.017 12:02:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:56.017 "name": "raid_bdev1", 00:21:56.017 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:56.017 "strip_size_kb": 0, 00:21:56.017 "state": "online", 00:21:56.017 "raid_level": "raid1", 00:21:56.017 "superblock": true, 00:21:56.017 "num_base_bdevs": 2, 00:21:56.017 "num_base_bdevs_discovered": 1, 00:21:56.017 "num_base_bdevs_operational": 1, 00:21:56.017 "base_bdevs_list": [ 00:21:56.017 { 00:21:56.017 "name": null, 00:21:56.017 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:21:56.017 "is_configured": false, 00:21:56.017 "data_offset": 256, 00:21:56.017 "data_size": 7936 00:21:56.017 }, 00:21:56.017 { 00:21:56.017 "name": "BaseBdev2", 00:21:56.017 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:56.017 "is_configured": true, 00:21:56.017 "data_offset": 256, 00:21:56.017 "data_size": 7936 00:21:56.017 } 00:21:56.017 ] 00:21:56.017 }' 00:21:56.017 12:02:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:56.017 12:02:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:56.017 12:02:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:56.017 12:02:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:56.017 12:02:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:56.017 12:02:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:56.275 [2024-07-12 12:02:46.409245] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:56.275 [2024-07-12 12:02:46.409275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:56.275 [2024-07-12 12:02:46.409287] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa61f30 00:21:56.275 [2024-07-12 12:02:46.409308] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:56.275 [2024-07-12 12:02:46.409560] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:56.275 [2024-07-12 12:02:46.409571] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:21:56.275 [2024-07-12 12:02:46.409615] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:56.275 [2024-07-12 12:02:46.409623] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:56.275 [2024-07-12 12:02:46.409628] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:56.275 BaseBdev1 00:21:56.275 12:02:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:57.208 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.208 12:02:47 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.466 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.466 "name": "raid_bdev1", 00:21:57.466 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:57.466 "strip_size_kb": 0, 00:21:57.466 "state": "online", 00:21:57.466 "raid_level": "raid1", 00:21:57.466 "superblock": true, 00:21:57.466 "num_base_bdevs": 2, 00:21:57.466 "num_base_bdevs_discovered": 1, 00:21:57.466 "num_base_bdevs_operational": 1, 00:21:57.466 "base_bdevs_list": [ 00:21:57.466 { 00:21:57.466 "name": null, 00:21:57.466 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.466 "is_configured": false, 00:21:57.466 "data_offset": 256, 00:21:57.466 "data_size": 7936 00:21:57.466 }, 00:21:57.466 { 00:21:57.466 "name": "BaseBdev2", 00:21:57.466 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:57.466 "is_configured": true, 00:21:57.466 "data_offset": 256, 00:21:57.466 "data_size": 7936 00:21:57.466 } 00:21:57.466 ] 00:21:57.466 }' 00:21:57.466 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.466 12:02:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:58.033 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:58.033 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:58.033 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:58.033 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:58.033 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:58.033 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.033 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.033 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:58.033 "name": "raid_bdev1", 00:21:58.033 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:58.033 "strip_size_kb": 0, 00:21:58.033 "state": "online", 00:21:58.033 "raid_level": "raid1", 00:21:58.033 "superblock": true, 00:21:58.033 "num_base_bdevs": 2, 00:21:58.033 "num_base_bdevs_discovered": 1, 00:21:58.033 "num_base_bdevs_operational": 1, 00:21:58.033 "base_bdevs_list": [ 00:21:58.033 { 00:21:58.033 "name": null, 00:21:58.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.033 "is_configured": false, 00:21:58.033 "data_offset": 256, 00:21:58.033 "data_size": 7936 00:21:58.033 }, 00:21:58.033 { 00:21:58.033 "name": "BaseBdev2", 00:21:58.033 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:58.033 "is_configured": true, 00:21:58.033 "data_offset": 256, 00:21:58.033 "data_size": 7936 00:21:58.033 } 00:21:58.033 ] 00:21:58.033 }' 00:21:58.033 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:58.292 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:58.292 [2024-07-12 12:02:48.506735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:58.293 [2024-07-12 12:02:48.506828] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:58.293 [2024-07-12 12:02:48.506837] bdev_raid.c:3581:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:21:58.293 request: 00:21:58.293 { 00:21:58.293 "raid_bdev": "raid_bdev1", 00:21:58.293 "base_bdev": "BaseBdev1", 00:21:58.293 "method": "bdev_raid_add_base_bdev", 00:21:58.293 "req_id": 1 00:21:58.293 } 00:21:58.293 Got JSON-RPC error response 00:21:58.293 response: 00:21:58.293 { 00:21:58.293 "code": -22, 00:21:58.293 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:58.293 } 00:21:58.293 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:21:58.293 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:58.293 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:58.293 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:58.293 12:02:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.671 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.671 "name": "raid_bdev1", 00:21:59.671 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:21:59.671 "strip_size_kb": 0, 00:21:59.672 "state": "online", 00:21:59.672 "raid_level": "raid1", 00:21:59.672 "superblock": true, 00:21:59.672 "num_base_bdevs": 2, 00:21:59.672 "num_base_bdevs_discovered": 1, 00:21:59.672 "num_base_bdevs_operational": 1, 00:21:59.672 "base_bdevs_list": [ 00:21:59.672 { 00:21:59.672 "name": null, 00:21:59.672 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.672 "is_configured": false, 00:21:59.672 "data_offset": 256, 00:21:59.672 "data_size": 7936 00:21:59.672 }, 00:21:59.672 { 00:21:59.672 "name": "BaseBdev2", 00:21:59.672 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:21:59.672 "is_configured": true, 00:21:59.672 "data_offset": 256, 00:21:59.672 "data_size": 7936 00:21:59.672 } 00:21:59.672 ] 00:21:59.672 }' 00:21:59.672 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.672 12:02:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:00.239 
12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:00.239 "name": "raid_bdev1", 00:22:00.239 "uuid": "199ccd54-ebb5-4a92-abf2-3293b215035e", 00:22:00.239 "strip_size_kb": 0, 00:22:00.239 "state": "online", 00:22:00.239 "raid_level": "raid1", 00:22:00.239 "superblock": true, 00:22:00.239 "num_base_bdevs": 2, 00:22:00.239 "num_base_bdevs_discovered": 1, 00:22:00.239 "num_base_bdevs_operational": 1, 00:22:00.239 "base_bdevs_list": [ 00:22:00.239 { 00:22:00.239 "name": null, 00:22:00.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.239 "is_configured": false, 00:22:00.239 "data_offset": 256, 00:22:00.239 "data_size": 7936 00:22:00.239 }, 00:22:00.239 { 00:22:00.239 "name": "BaseBdev2", 00:22:00.239 "uuid": "2261fb42-c080-55ec-be1d-101f9e1bcc74", 00:22:00.239 "is_configured": true, 00:22:00.239 "data_offset": 256, 00:22:00.239 "data_size": 7936 00:22:00.239 } 00:22:00.239 ] 00:22:00.239 }' 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:00.239 12:02:50 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 728992 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 728992 ']' 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 728992 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:00.239 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 728992 00:22:00.499 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:00.499 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:00.499 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 728992' 00:22:00.499 killing process with pid 728992 00:22:00.499 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 728992 00:22:00.499 Received shutdown signal, test time was about 60.000000 seconds 00:22:00.499 00:22:00.499 Latency(us) 00:22:00.499 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:00.499 =================================================================================================================== 00:22:00.499 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:00.499 [2024-07-12 12:02:50.492189] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:00.499 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 728992 00:22:00.499 [2024-07-12 12:02:50.492260] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:00.499 [2024-07-12 12:02:50.492294] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:22:00.499 [2024-07-12 12:02:50.492305] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa61bb0 name raid_bdev1, state offline 00:22:00.499 [2024-07-12 12:02:50.515696] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:00.499 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:22:00.499 00:22:00.499 real 0m25.458s 00:22:00.499 user 0m39.160s 00:22:00.499 sys 0m3.188s 00:22:00.499 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:00.499 12:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:00.499 ************************************ 00:22:00.499 END TEST raid_rebuild_test_sb_4k 00:22:00.499 ************************************ 00:22:00.499 12:02:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:00.499 12:02:50 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:22:00.499 12:02:50 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:22:00.499 12:02:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:00.499 12:02:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:00.499 12:02:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:00.758 ************************************ 00:22:00.758 START TEST raid_state_function_test_sb_md_separate 00:22:00.758 ************************************ 00:22:00.758 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:22:00.758 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:00.758 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:22:00.759 12:02:50 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:00.759 12:02:50 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=733555 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 733555' 00:22:00.759 Process raid pid: 733555 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 733555 /var/tmp/spdk-raid.sock 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 733555 ']' 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:00.759 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:00.759 12:02:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:00.759 [2024-07-12 12:02:50.809296] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:22:00.759 [2024-07-12 12:02:50.809332] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:00.759 [2024-07-12 12:02:50.872711] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:00.759 [2024-07-12 12:02:50.949800] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:00.759 [2024-07-12 12:02:51.000527] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:00.759 [2024-07-12 12:02:51.000549] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:01.695 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:01.695 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:22:01.695 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:01.695 [2024-07-12 12:02:51.751337] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:01.695 [2024-07-12 12:02:51.751365] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:01.695 [2024-07-12 12:02:51.751371] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:01.695 [2024-07-12 12:02:51.751376] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:01.695 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:01.695 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:01.695 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:01.695 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:01.695 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:01.695 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:01.696 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.696 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.696 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.696 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.696 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.696 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:01.696 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.696 "name": "Existed_Raid", 00:22:01.696 "uuid": "dfcd87a3-ebba-431e-8b35-cedb2a19d472", 00:22:01.696 
"strip_size_kb": 0, 00:22:01.696 "state": "configuring", 00:22:01.696 "raid_level": "raid1", 00:22:01.696 "superblock": true, 00:22:01.696 "num_base_bdevs": 2, 00:22:01.696 "num_base_bdevs_discovered": 0, 00:22:01.696 "num_base_bdevs_operational": 2, 00:22:01.696 "base_bdevs_list": [ 00:22:01.696 { 00:22:01.696 "name": "BaseBdev1", 00:22:01.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.696 "is_configured": false, 00:22:01.696 "data_offset": 0, 00:22:01.696 "data_size": 0 00:22:01.696 }, 00:22:01.696 { 00:22:01.696 "name": "BaseBdev2", 00:22:01.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.696 "is_configured": false, 00:22:01.696 "data_offset": 0, 00:22:01.696 "data_size": 0 00:22:01.696 } 00:22:01.696 ] 00:22:01.696 }' 00:22:01.696 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.696 12:02:51 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:02.264 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:02.523 [2024-07-12 12:02:52.557334] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:02.523 [2024-07-12 12:02:52.557355] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17e21b0 name Existed_Raid, state configuring 00:22:02.523 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:02.523 [2024-07-12 12:02:52.729885] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:02.523 [2024-07-12 12:02:52.729904] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:22:02.523 [2024-07-12 12:02:52.729909] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:02.524 [2024-07-12 12:02:52.729914] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:02.524 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:22:02.782 [2024-07-12 12:02:52.911596] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:02.782 BaseBdev1 00:22:02.782 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:02.782 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:02.782 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:02.782 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:22:02.782 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:02.782 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:02.782 12:02:52 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:03.041 [ 00:22:03.041 { 00:22:03.041 "name": "BaseBdev1", 00:22:03.041 "aliases": [ 00:22:03.041 
"c61ae0fb-4f0b-4743-8286-a60e7ac29bfc" 00:22:03.041 ], 00:22:03.041 "product_name": "Malloc disk", 00:22:03.041 "block_size": 4096, 00:22:03.041 "num_blocks": 8192, 00:22:03.041 "uuid": "c61ae0fb-4f0b-4743-8286-a60e7ac29bfc", 00:22:03.041 "md_size": 32, 00:22:03.041 "md_interleave": false, 00:22:03.041 "dif_type": 0, 00:22:03.041 "assigned_rate_limits": { 00:22:03.041 "rw_ios_per_sec": 0, 00:22:03.041 "rw_mbytes_per_sec": 0, 00:22:03.041 "r_mbytes_per_sec": 0, 00:22:03.041 "w_mbytes_per_sec": 0 00:22:03.041 }, 00:22:03.041 "claimed": true, 00:22:03.041 "claim_type": "exclusive_write", 00:22:03.041 "zoned": false, 00:22:03.041 "supported_io_types": { 00:22:03.041 "read": true, 00:22:03.041 "write": true, 00:22:03.041 "unmap": true, 00:22:03.041 "flush": true, 00:22:03.041 "reset": true, 00:22:03.041 "nvme_admin": false, 00:22:03.041 "nvme_io": false, 00:22:03.041 "nvme_io_md": false, 00:22:03.041 "write_zeroes": true, 00:22:03.041 "zcopy": true, 00:22:03.041 "get_zone_info": false, 00:22:03.041 "zone_management": false, 00:22:03.041 "zone_append": false, 00:22:03.041 "compare": false, 00:22:03.041 "compare_and_write": false, 00:22:03.041 "abort": true, 00:22:03.041 "seek_hole": false, 00:22:03.041 "seek_data": false, 00:22:03.041 "copy": true, 00:22:03.041 "nvme_iov_md": false 00:22:03.041 }, 00:22:03.041 "memory_domains": [ 00:22:03.041 { 00:22:03.041 "dma_device_id": "system", 00:22:03.041 "dma_device_type": 1 00:22:03.041 }, 00:22:03.041 { 00:22:03.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.041 "dma_device_type": 2 00:22:03.041 } 00:22:03.041 ], 00:22:03.041 "driver_specific": {} 00:22:03.041 } 00:22:03.041 ] 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:03.041 12:02:53 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:03.041 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.312 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:03.312 "name": "Existed_Raid", 00:22:03.312 "uuid": "674eebce-1b62-4327-b687-003b2fc0d36e", 00:22:03.312 "strip_size_kb": 0, 00:22:03.312 "state": "configuring", 00:22:03.312 "raid_level": "raid1", 00:22:03.312 "superblock": true, 00:22:03.312 "num_base_bdevs": 2, 00:22:03.312 "num_base_bdevs_discovered": 1, 00:22:03.312 "num_base_bdevs_operational": 2, 00:22:03.312 
"base_bdevs_list": [ 00:22:03.312 { 00:22:03.312 "name": "BaseBdev1", 00:22:03.312 "uuid": "c61ae0fb-4f0b-4743-8286-a60e7ac29bfc", 00:22:03.312 "is_configured": true, 00:22:03.312 "data_offset": 256, 00:22:03.312 "data_size": 7936 00:22:03.312 }, 00:22:03.312 { 00:22:03.312 "name": "BaseBdev2", 00:22:03.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:03.313 "is_configured": false, 00:22:03.313 "data_offset": 0, 00:22:03.313 "data_size": 0 00:22:03.313 } 00:22:03.313 ] 00:22:03.313 }' 00:22:03.313 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:03.313 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:03.925 12:02:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:03.925 [2024-07-12 12:02:54.062702] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:03.925 [2024-07-12 12:02:54.062735] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17e1aa0 name Existed_Raid, state configuring 00:22:03.925 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:04.183 [2024-07-12 12:02:54.231165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:04.183 [2024-07-12 12:02:54.232269] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:04.183 [2024-07-12 12:02:54.232292] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:04.183 
12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.183 "name": "Existed_Raid", 00:22:04.183 "uuid": 
"58fa1872-5d0a-4321-ba1d-dccee5092548", 00:22:04.183 "strip_size_kb": 0, 00:22:04.183 "state": "configuring", 00:22:04.183 "raid_level": "raid1", 00:22:04.183 "superblock": true, 00:22:04.183 "num_base_bdevs": 2, 00:22:04.183 "num_base_bdevs_discovered": 1, 00:22:04.183 "num_base_bdevs_operational": 2, 00:22:04.183 "base_bdevs_list": [ 00:22:04.183 { 00:22:04.183 "name": "BaseBdev1", 00:22:04.183 "uuid": "c61ae0fb-4f0b-4743-8286-a60e7ac29bfc", 00:22:04.183 "is_configured": true, 00:22:04.183 "data_offset": 256, 00:22:04.183 "data_size": 7936 00:22:04.183 }, 00:22:04.183 { 00:22:04.183 "name": "BaseBdev2", 00:22:04.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.183 "is_configured": false, 00:22:04.183 "data_offset": 0, 00:22:04.183 "data_size": 0 00:22:04.183 } 00:22:04.183 ] 00:22:04.183 }' 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.183 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:04.750 12:02:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:22:05.008 [2024-07-12 12:02:55.060487] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:05.008 [2024-07-12 12:02:55.060595] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x197f430 00:22:05.008 [2024-07-12 12:02:55.060603] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:05.008 [2024-07-12 12:02:55.060663] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17d9830 00:22:05.008 [2024-07-12 12:02:55.060737] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x197f430 00:22:05.008 [2024-07-12 12:02:55.060743] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid 
bdev is created with name Existed_Raid, raid_bdev 0x197f430 00:22:05.008 [2024-07-12 12:02:55.060786] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:05.008 BaseBdev2 00:22:05.008 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:05.008 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:05.008 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:05.008 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:22:05.009 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:05.009 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:05.009 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:05.009 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:05.267 [ 00:22:05.267 { 00:22:05.267 "name": "BaseBdev2", 00:22:05.267 "aliases": [ 00:22:05.267 "114b0233-9ca6-48eb-aba6-98b480c38c19" 00:22:05.267 ], 00:22:05.267 "product_name": "Malloc disk", 00:22:05.267 "block_size": 4096, 00:22:05.267 "num_blocks": 8192, 00:22:05.267 "uuid": "114b0233-9ca6-48eb-aba6-98b480c38c19", 00:22:05.267 "md_size": 32, 00:22:05.267 "md_interleave": false, 00:22:05.267 "dif_type": 0, 00:22:05.267 "assigned_rate_limits": { 00:22:05.267 "rw_ios_per_sec": 0, 00:22:05.267 "rw_mbytes_per_sec": 0, 00:22:05.267 "r_mbytes_per_sec": 0, 00:22:05.267 
"w_mbytes_per_sec": 0 00:22:05.267 }, 00:22:05.267 "claimed": true, 00:22:05.267 "claim_type": "exclusive_write", 00:22:05.267 "zoned": false, 00:22:05.267 "supported_io_types": { 00:22:05.267 "read": true, 00:22:05.267 "write": true, 00:22:05.267 "unmap": true, 00:22:05.267 "flush": true, 00:22:05.267 "reset": true, 00:22:05.267 "nvme_admin": false, 00:22:05.268 "nvme_io": false, 00:22:05.268 "nvme_io_md": false, 00:22:05.268 "write_zeroes": true, 00:22:05.268 "zcopy": true, 00:22:05.268 "get_zone_info": false, 00:22:05.268 "zone_management": false, 00:22:05.268 "zone_append": false, 00:22:05.268 "compare": false, 00:22:05.268 "compare_and_write": false, 00:22:05.268 "abort": true, 00:22:05.268 "seek_hole": false, 00:22:05.268 "seek_data": false, 00:22:05.268 "copy": true, 00:22:05.268 "nvme_iov_md": false 00:22:05.268 }, 00:22:05.268 "memory_domains": [ 00:22:05.268 { 00:22:05.268 "dma_device_id": "system", 00:22:05.268 "dma_device_type": 1 00:22:05.268 }, 00:22:05.268 { 00:22:05.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.268 "dma_device_type": 2 00:22:05.268 } 00:22:05.268 ], 00:22:05.268 "driver_specific": {} 00:22:05.268 } 00:22:05.268 ] 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:05.268 12:02:55 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.268 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:05.527 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:05.527 "name": "Existed_Raid", 00:22:05.527 "uuid": "58fa1872-5d0a-4321-ba1d-dccee5092548", 00:22:05.527 "strip_size_kb": 0, 00:22:05.527 "state": "online", 00:22:05.527 "raid_level": "raid1", 00:22:05.527 "superblock": true, 00:22:05.527 "num_base_bdevs": 2, 00:22:05.527 "num_base_bdevs_discovered": 2, 00:22:05.527 "num_base_bdevs_operational": 2, 00:22:05.527 "base_bdevs_list": [ 00:22:05.527 { 00:22:05.527 "name": "BaseBdev1", 00:22:05.527 "uuid": "c61ae0fb-4f0b-4743-8286-a60e7ac29bfc", 00:22:05.527 "is_configured": true, 00:22:05.527 "data_offset": 256, 00:22:05.527 "data_size": 7936 00:22:05.527 }, 00:22:05.527 { 00:22:05.527 "name": 
"BaseBdev2", 00:22:05.527 "uuid": "114b0233-9ca6-48eb-aba6-98b480c38c19", 00:22:05.527 "is_configured": true, 00:22:05.527 "data_offset": 256, 00:22:05.527 "data_size": 7936 00:22:05.527 } 00:22:05.527 ] 00:22:05.527 }' 00:22:05.527 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:05.527 12:02:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:06.095 [2024-07-12 12:02:56.191603] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:06.095 "name": "Existed_Raid", 00:22:06.095 "aliases": [ 00:22:06.095 "58fa1872-5d0a-4321-ba1d-dccee5092548" 00:22:06.095 ], 00:22:06.095 "product_name": "Raid Volume", 00:22:06.095 "block_size": 4096, 
00:22:06.095 "num_blocks": 7936, 00:22:06.095 "uuid": "58fa1872-5d0a-4321-ba1d-dccee5092548", 00:22:06.095 "md_size": 32, 00:22:06.095 "md_interleave": false, 00:22:06.095 "dif_type": 0, 00:22:06.095 "assigned_rate_limits": { 00:22:06.095 "rw_ios_per_sec": 0, 00:22:06.095 "rw_mbytes_per_sec": 0, 00:22:06.095 "r_mbytes_per_sec": 0, 00:22:06.095 "w_mbytes_per_sec": 0 00:22:06.095 }, 00:22:06.095 "claimed": false, 00:22:06.095 "zoned": false, 00:22:06.095 "supported_io_types": { 00:22:06.095 "read": true, 00:22:06.095 "write": true, 00:22:06.095 "unmap": false, 00:22:06.095 "flush": false, 00:22:06.095 "reset": true, 00:22:06.095 "nvme_admin": false, 00:22:06.095 "nvme_io": false, 00:22:06.095 "nvme_io_md": false, 00:22:06.095 "write_zeroes": true, 00:22:06.095 "zcopy": false, 00:22:06.095 "get_zone_info": false, 00:22:06.095 "zone_management": false, 00:22:06.095 "zone_append": false, 00:22:06.095 "compare": false, 00:22:06.095 "compare_and_write": false, 00:22:06.095 "abort": false, 00:22:06.095 "seek_hole": false, 00:22:06.095 "seek_data": false, 00:22:06.095 "copy": false, 00:22:06.095 "nvme_iov_md": false 00:22:06.095 }, 00:22:06.095 "memory_domains": [ 00:22:06.095 { 00:22:06.095 "dma_device_id": "system", 00:22:06.095 "dma_device_type": 1 00:22:06.095 }, 00:22:06.095 { 00:22:06.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.095 "dma_device_type": 2 00:22:06.095 }, 00:22:06.095 { 00:22:06.095 "dma_device_id": "system", 00:22:06.095 "dma_device_type": 1 00:22:06.095 }, 00:22:06.095 { 00:22:06.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.095 "dma_device_type": 2 00:22:06.095 } 00:22:06.095 ], 00:22:06.095 "driver_specific": { 00:22:06.095 "raid": { 00:22:06.095 "uuid": "58fa1872-5d0a-4321-ba1d-dccee5092548", 00:22:06.095 "strip_size_kb": 0, 00:22:06.095 "state": "online", 00:22:06.095 "raid_level": "raid1", 00:22:06.095 "superblock": true, 00:22:06.095 "num_base_bdevs": 2, 00:22:06.095 "num_base_bdevs_discovered": 2, 00:22:06.095 
"num_base_bdevs_operational": 2, 00:22:06.095 "base_bdevs_list": [ 00:22:06.095 { 00:22:06.095 "name": "BaseBdev1", 00:22:06.095 "uuid": "c61ae0fb-4f0b-4743-8286-a60e7ac29bfc", 00:22:06.095 "is_configured": true, 00:22:06.095 "data_offset": 256, 00:22:06.095 "data_size": 7936 00:22:06.095 }, 00:22:06.095 { 00:22:06.095 "name": "BaseBdev2", 00:22:06.095 "uuid": "114b0233-9ca6-48eb-aba6-98b480c38c19", 00:22:06.095 "is_configured": true, 00:22:06.095 "data_offset": 256, 00:22:06.095 "data_size": 7936 00:22:06.095 } 00:22:06.095 ] 00:22:06.095 } 00:22:06.095 } 00:22:06.095 }' 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:06.095 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:06.095 BaseBdev2' 00:22:06.096 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:06.096 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:06.096 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:06.354 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:06.354 "name": "BaseBdev1", 00:22:06.354 "aliases": [ 00:22:06.354 "c61ae0fb-4f0b-4743-8286-a60e7ac29bfc" 00:22:06.354 ], 00:22:06.354 "product_name": "Malloc disk", 00:22:06.354 "block_size": 4096, 00:22:06.354 "num_blocks": 8192, 00:22:06.354 "uuid": "c61ae0fb-4f0b-4743-8286-a60e7ac29bfc", 00:22:06.354 "md_size": 32, 00:22:06.354 "md_interleave": false, 00:22:06.354 "dif_type": 0, 00:22:06.354 "assigned_rate_limits": { 00:22:06.354 "rw_ios_per_sec": 0, 00:22:06.354 
"rw_mbytes_per_sec": 0, 00:22:06.354 "r_mbytes_per_sec": 0, 00:22:06.354 "w_mbytes_per_sec": 0 00:22:06.354 }, 00:22:06.354 "claimed": true, 00:22:06.354 "claim_type": "exclusive_write", 00:22:06.354 "zoned": false, 00:22:06.354 "supported_io_types": { 00:22:06.354 "read": true, 00:22:06.354 "write": true, 00:22:06.354 "unmap": true, 00:22:06.354 "flush": true, 00:22:06.354 "reset": true, 00:22:06.354 "nvme_admin": false, 00:22:06.354 "nvme_io": false, 00:22:06.354 "nvme_io_md": false, 00:22:06.354 "write_zeroes": true, 00:22:06.354 "zcopy": true, 00:22:06.354 "get_zone_info": false, 00:22:06.354 "zone_management": false, 00:22:06.354 "zone_append": false, 00:22:06.354 "compare": false, 00:22:06.354 "compare_and_write": false, 00:22:06.354 "abort": true, 00:22:06.354 "seek_hole": false, 00:22:06.354 "seek_data": false, 00:22:06.354 "copy": true, 00:22:06.354 "nvme_iov_md": false 00:22:06.354 }, 00:22:06.354 "memory_domains": [ 00:22:06.354 { 00:22:06.354 "dma_device_id": "system", 00:22:06.354 "dma_device_type": 1 00:22:06.354 }, 00:22:06.354 { 00:22:06.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.354 "dma_device_type": 2 00:22:06.354 } 00:22:06.354 ], 00:22:06.354 "driver_specific": {} 00:22:06.354 }' 00:22:06.354 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.354 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.354 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:06.354 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.354 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.354 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:06.354 12:02:56 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:06.613 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:06.613 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:06.613 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:06.613 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:06.613 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:06.613 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:06.613 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:06.613 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:06.870 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:06.870 "name": "BaseBdev2", 00:22:06.870 "aliases": [ 00:22:06.870 "114b0233-9ca6-48eb-aba6-98b480c38c19" 00:22:06.870 ], 00:22:06.870 "product_name": "Malloc disk", 00:22:06.870 "block_size": 4096, 00:22:06.870 "num_blocks": 8192, 00:22:06.870 "uuid": "114b0233-9ca6-48eb-aba6-98b480c38c19", 00:22:06.870 "md_size": 32, 00:22:06.870 "md_interleave": false, 00:22:06.870 "dif_type": 0, 00:22:06.870 "assigned_rate_limits": { 00:22:06.870 "rw_ios_per_sec": 0, 00:22:06.870 "rw_mbytes_per_sec": 0, 00:22:06.870 "r_mbytes_per_sec": 0, 00:22:06.870 "w_mbytes_per_sec": 0 00:22:06.870 }, 00:22:06.870 "claimed": true, 00:22:06.870 "claim_type": "exclusive_write", 00:22:06.870 "zoned": false, 00:22:06.870 "supported_io_types": { 
00:22:06.870 "read": true, 00:22:06.870 "write": true, 00:22:06.870 "unmap": true, 00:22:06.870 "flush": true, 00:22:06.870 "reset": true, 00:22:06.870 "nvme_admin": false, 00:22:06.870 "nvme_io": false, 00:22:06.870 "nvme_io_md": false, 00:22:06.870 "write_zeroes": true, 00:22:06.870 "zcopy": true, 00:22:06.870 "get_zone_info": false, 00:22:06.870 "zone_management": false, 00:22:06.870 "zone_append": false, 00:22:06.870 "compare": false, 00:22:06.870 "compare_and_write": false, 00:22:06.871 "abort": true, 00:22:06.871 "seek_hole": false, 00:22:06.871 "seek_data": false, 00:22:06.871 "copy": true, 00:22:06.871 "nvme_iov_md": false 00:22:06.871 }, 00:22:06.871 "memory_domains": [ 00:22:06.871 { 00:22:06.871 "dma_device_id": "system", 00:22:06.871 "dma_device_type": 1 00:22:06.871 }, 00:22:06.871 { 00:22:06.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.871 "dma_device_type": 2 00:22:06.871 } 00:22:06.871 ], 00:22:06.871 "driver_specific": {} 00:22:06.871 }' 00:22:06.871 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.871 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.871 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:06.871 12:02:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.871 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.871 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:06.871 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:06.871 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:07.130 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:07.130 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:07.130 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:07.130 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:07.130 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:07.414 [2024-07-12 12:02:57.382561] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:07.414 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:07.414 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:07.414 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.415 
12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.415 "name": "Existed_Raid", 00:22:07.415 "uuid": "58fa1872-5d0a-4321-ba1d-dccee5092548", 00:22:07.415 "strip_size_kb": 0, 00:22:07.415 "state": "online", 00:22:07.415 "raid_level": "raid1", 00:22:07.415 "superblock": true, 00:22:07.415 "num_base_bdevs": 2, 00:22:07.415 "num_base_bdevs_discovered": 1, 00:22:07.415 "num_base_bdevs_operational": 1, 00:22:07.415 "base_bdevs_list": [ 00:22:07.415 { 00:22:07.415 "name": null, 00:22:07.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.415 "is_configured": false, 00:22:07.415 "data_offset": 256, 00:22:07.415 "data_size": 7936 00:22:07.415 }, 00:22:07.415 { 00:22:07.415 "name": "BaseBdev2", 00:22:07.415 "uuid": "114b0233-9ca6-48eb-aba6-98b480c38c19", 00:22:07.415 "is_configured": true, 00:22:07.415 "data_offset": 256, 00:22:07.415 "data_size": 7936 00:22:07.415 } 00:22:07.415 ] 00:22:07.415 }' 00:22:07.415 12:02:57 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.415 12:02:57 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:07.983 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:07.983 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:07.983 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:07.983 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.983 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:07.983 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:07.983 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:08.241 [2024-07-12 12:02:58.342723] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:08.241 [2024-07-12 12:02:58.342789] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:08.241 [2024-07-12 12:02:58.353339] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:08.241 [2024-07-12 12:02:58.353365] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:08.242 [2024-07-12 12:02:58.353371] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x197f430 name Existed_Raid, state offline 00:22:08.242 12:02:58 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:08.242 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:08.242 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.242 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 733555 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 733555 ']' 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 733555 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 733555 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:08.500 12:02:58 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 733555' 00:22:08.500 killing process with pid 733555 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 733555 00:22:08.500 [2024-07-12 12:02:58.591207] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:08.500 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 733555 00:22:08.500 [2024-07-12 12:02:58.591969] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:08.759 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:22:08.759 00:22:08.759 real 0m8.011s 00:22:08.759 user 0m14.347s 00:22:08.759 sys 0m1.305s 00:22:08.759 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:08.759 12:02:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:08.759 ************************************ 00:22:08.759 END TEST raid_state_function_test_sb_md_separate 00:22:08.759 ************************************ 00:22:08.759 12:02:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:08.759 12:02:58 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:22:08.759 12:02:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:08.759 12:02:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:08.759 12:02:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:08.759 ************************************ 00:22:08.759 START TEST raid_superblock_test_md_separate 00:22:08.759 ************************************ 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:22:08.759 
12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=735143 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 735143 /var/tmp/spdk-raid.sock 
00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 735143 ']' 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:08.759 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:08.759 12:02:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:08.759 [2024-07-12 12:02:58.875771] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:22:08.759 [2024-07-12 12:02:58.875805] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid735143 ] 00:22:08.759 [2024-07-12 12:02:58.938283] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:09.018 [2024-07-12 12:02:59.017236] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:09.018 [2024-07-12 12:02:59.076174] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:09.018 [2024-07-12 12:02:59.076198] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:09.586 12:02:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:09.586 12:02:59 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:22:09.586 malloc1 00:22:09.845 12:02:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:09.845 [2024-07-12 12:02:59.988343] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:09.845 [2024-07-12 12:02:59.988376] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.845 [2024-07-12 12:02:59.988388] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc94d40 00:22:09.845 [2024-07-12 12:02:59.988394] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.845 [2024-07-12 12:02:59.989502] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.845 [2024-07-12 12:02:59.989529] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:09.845 pt1 00:22:09.845 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:09.845 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:09.845 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:09.845 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:09.845 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:09.845 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:09.845 12:03:00 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:09.845 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:09.845 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:22:10.104 malloc2 00:22:10.104 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:10.104 [2024-07-12 12:03:00.325663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:10.104 [2024-07-12 12:03:00.325703] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:10.104 [2024-07-12 12:03:00.325713] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe213b0 00:22:10.104 [2024-07-12 12:03:00.325719] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:10.104 [2024-07-12 12:03:00.326718] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:10.104 [2024-07-12 12:03:00.326736] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:10.104 pt2 00:22:10.104 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:10.104 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:10.104 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:22:10.361 [2024-07-12 12:03:00.498122] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:10.361 [2024-07-12 12:03:00.499020] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:10.361 [2024-07-12 12:03:00.499124] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe15890 00:22:10.361 [2024-07-12 12:03:00.499132] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:10.361 [2024-07-12 12:03:00.499181] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe17c80 00:22:10.361 [2024-07-12 12:03:00.499257] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe15890 00:22:10.361 [2024-07-12 12:03:00.499262] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe15890 00:22:10.361 [2024-07-12 12:03:00.499307] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.361 12:03:00 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.361 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.619 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.619 "name": "raid_bdev1", 00:22:10.619 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:10.619 "strip_size_kb": 0, 00:22:10.619 "state": "online", 00:22:10.619 "raid_level": "raid1", 00:22:10.619 "superblock": true, 00:22:10.619 "num_base_bdevs": 2, 00:22:10.619 "num_base_bdevs_discovered": 2, 00:22:10.619 "num_base_bdevs_operational": 2, 00:22:10.619 "base_bdevs_list": [ 00:22:10.619 { 00:22:10.619 "name": "pt1", 00:22:10.619 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:10.619 "is_configured": true, 00:22:10.619 "data_offset": 256, 00:22:10.619 "data_size": 7936 00:22:10.619 }, 00:22:10.619 { 00:22:10.619 "name": "pt2", 00:22:10.619 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:10.619 "is_configured": true, 00:22:10.619 "data_offset": 256, 00:22:10.619 "data_size": 7936 00:22:10.619 } 00:22:10.619 ] 00:22:10.619 }' 00:22:10.619 12:03:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.619 12:03:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:11.187 [2024-07-12 12:03:01.292362] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:11.187 "name": "raid_bdev1", 00:22:11.187 "aliases": [ 00:22:11.187 "97afe5a5-bb65-498a-9bf8-d2f605bc5709" 00:22:11.187 ], 00:22:11.187 "product_name": "Raid Volume", 00:22:11.187 "block_size": 4096, 00:22:11.187 "num_blocks": 7936, 00:22:11.187 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:11.187 "md_size": 32, 00:22:11.187 "md_interleave": false, 00:22:11.187 "dif_type": 0, 00:22:11.187 "assigned_rate_limits": { 00:22:11.187 "rw_ios_per_sec": 0, 00:22:11.187 "rw_mbytes_per_sec": 0, 00:22:11.187 "r_mbytes_per_sec": 0, 00:22:11.187 "w_mbytes_per_sec": 0 00:22:11.187 }, 00:22:11.187 "claimed": false, 00:22:11.187 "zoned": false, 00:22:11.187 "supported_io_types": { 00:22:11.187 "read": true, 00:22:11.187 "write": true, 00:22:11.187 "unmap": false, 00:22:11.187 "flush": false, 00:22:11.187 "reset": true, 00:22:11.187 "nvme_admin": false, 00:22:11.187 "nvme_io": false, 00:22:11.187 "nvme_io_md": false, 00:22:11.187 "write_zeroes": true, 
00:22:11.187 "zcopy": false, 00:22:11.187 "get_zone_info": false, 00:22:11.187 "zone_management": false, 00:22:11.187 "zone_append": false, 00:22:11.187 "compare": false, 00:22:11.187 "compare_and_write": false, 00:22:11.187 "abort": false, 00:22:11.187 "seek_hole": false, 00:22:11.187 "seek_data": false, 00:22:11.187 "copy": false, 00:22:11.187 "nvme_iov_md": false 00:22:11.187 }, 00:22:11.187 "memory_domains": [ 00:22:11.187 { 00:22:11.187 "dma_device_id": "system", 00:22:11.187 "dma_device_type": 1 00:22:11.187 }, 00:22:11.187 { 00:22:11.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.187 "dma_device_type": 2 00:22:11.187 }, 00:22:11.187 { 00:22:11.187 "dma_device_id": "system", 00:22:11.187 "dma_device_type": 1 00:22:11.187 }, 00:22:11.187 { 00:22:11.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.187 "dma_device_type": 2 00:22:11.187 } 00:22:11.187 ], 00:22:11.187 "driver_specific": { 00:22:11.187 "raid": { 00:22:11.187 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:11.187 "strip_size_kb": 0, 00:22:11.187 "state": "online", 00:22:11.187 "raid_level": "raid1", 00:22:11.187 "superblock": true, 00:22:11.187 "num_base_bdevs": 2, 00:22:11.187 "num_base_bdevs_discovered": 2, 00:22:11.187 "num_base_bdevs_operational": 2, 00:22:11.187 "base_bdevs_list": [ 00:22:11.187 { 00:22:11.187 "name": "pt1", 00:22:11.187 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:11.187 "is_configured": true, 00:22:11.187 "data_offset": 256, 00:22:11.187 "data_size": 7936 00:22:11.187 }, 00:22:11.187 { 00:22:11.187 "name": "pt2", 00:22:11.187 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:11.187 "is_configured": true, 00:22:11.187 "data_offset": 256, 00:22:11.187 "data_size": 7936 00:22:11.187 } 00:22:11.187 ] 00:22:11.187 } 00:22:11.187 } 00:22:11.187 }' 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:11.187 12:03:01 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:11.187 pt2' 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:11.187 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:11.446 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:11.446 "name": "pt1", 00:22:11.446 "aliases": [ 00:22:11.446 "00000000-0000-0000-0000-000000000001" 00:22:11.446 ], 00:22:11.446 "product_name": "passthru", 00:22:11.446 "block_size": 4096, 00:22:11.446 "num_blocks": 8192, 00:22:11.446 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:11.446 "md_size": 32, 00:22:11.446 "md_interleave": false, 00:22:11.446 "dif_type": 0, 00:22:11.446 "assigned_rate_limits": { 00:22:11.446 "rw_ios_per_sec": 0, 00:22:11.446 "rw_mbytes_per_sec": 0, 00:22:11.446 "r_mbytes_per_sec": 0, 00:22:11.446 "w_mbytes_per_sec": 0 00:22:11.446 }, 00:22:11.446 "claimed": true, 00:22:11.446 "claim_type": "exclusive_write", 00:22:11.446 "zoned": false, 00:22:11.446 "supported_io_types": { 00:22:11.446 "read": true, 00:22:11.446 "write": true, 00:22:11.446 "unmap": true, 00:22:11.446 "flush": true, 00:22:11.446 "reset": true, 00:22:11.446 "nvme_admin": false, 00:22:11.446 "nvme_io": false, 00:22:11.446 "nvme_io_md": false, 00:22:11.446 "write_zeroes": true, 00:22:11.446 "zcopy": true, 00:22:11.446 "get_zone_info": false, 00:22:11.446 "zone_management": false, 00:22:11.446 "zone_append": false, 00:22:11.446 "compare": false, 00:22:11.446 "compare_and_write": false, 00:22:11.446 "abort": true, 00:22:11.446 "seek_hole": false, 00:22:11.446 "seek_data": false, 00:22:11.446 "copy": true, 00:22:11.446 
"nvme_iov_md": false 00:22:11.446 }, 00:22:11.446 "memory_domains": [ 00:22:11.446 { 00:22:11.446 "dma_device_id": "system", 00:22:11.446 "dma_device_type": 1 00:22:11.446 }, 00:22:11.446 { 00:22:11.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.446 "dma_device_type": 2 00:22:11.446 } 00:22:11.446 ], 00:22:11.446 "driver_specific": { 00:22:11.446 "passthru": { 00:22:11.446 "name": "pt1", 00:22:11.446 "base_bdev_name": "malloc1" 00:22:11.446 } 00:22:11.446 } 00:22:11.446 }' 00:22:11.446 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.446 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.446 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:11.446 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.446 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.446 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:11.446 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.705 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.705 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:11.705 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.705 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.705 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:11.705 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:11.705 12:03:01 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:11.705 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:11.963 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:11.963 "name": "pt2", 00:22:11.963 "aliases": [ 00:22:11.963 "00000000-0000-0000-0000-000000000002" 00:22:11.963 ], 00:22:11.963 "product_name": "passthru", 00:22:11.963 "block_size": 4096, 00:22:11.963 "num_blocks": 8192, 00:22:11.963 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:11.963 "md_size": 32, 00:22:11.963 "md_interleave": false, 00:22:11.963 "dif_type": 0, 00:22:11.963 "assigned_rate_limits": { 00:22:11.963 "rw_ios_per_sec": 0, 00:22:11.963 "rw_mbytes_per_sec": 0, 00:22:11.963 "r_mbytes_per_sec": 0, 00:22:11.963 "w_mbytes_per_sec": 0 00:22:11.963 }, 00:22:11.963 "claimed": true, 00:22:11.963 "claim_type": "exclusive_write", 00:22:11.963 "zoned": false, 00:22:11.963 "supported_io_types": { 00:22:11.963 "read": true, 00:22:11.963 "write": true, 00:22:11.963 "unmap": true, 00:22:11.963 "flush": true, 00:22:11.963 "reset": true, 00:22:11.963 "nvme_admin": false, 00:22:11.963 "nvme_io": false, 00:22:11.963 "nvme_io_md": false, 00:22:11.963 "write_zeroes": true, 00:22:11.963 "zcopy": true, 00:22:11.963 "get_zone_info": false, 00:22:11.963 "zone_management": false, 00:22:11.963 "zone_append": false, 00:22:11.963 "compare": false, 00:22:11.963 "compare_and_write": false, 00:22:11.963 "abort": true, 00:22:11.963 "seek_hole": false, 00:22:11.963 "seek_data": false, 00:22:11.963 "copy": true, 00:22:11.963 "nvme_iov_md": false 00:22:11.963 }, 00:22:11.964 "memory_domains": [ 00:22:11.964 { 00:22:11.964 "dma_device_id": "system", 00:22:11.964 "dma_device_type": 1 00:22:11.964 }, 00:22:11.964 { 00:22:11.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.964 "dma_device_type": 2 00:22:11.964 } 
00:22:11.964 ], 00:22:11.964 "driver_specific": { 00:22:11.964 "passthru": { 00:22:11.964 "name": "pt2", 00:22:11.964 "base_bdev_name": "malloc2" 00:22:11.964 } 00:22:11.964 } 00:22:11.964 }' 00:22:11.964 12:03:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.964 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.964 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:11.964 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.964 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.964 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:11.964 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.964 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.223 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:12.223 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.223 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.223 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:12.223 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:12.223 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:12.223 [2024-07-12 12:03:02.451322] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:12.223 12:03:02 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=97afe5a5-bb65-498a-9bf8-d2f605bc5709 00:22:12.223 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 97afe5a5-bb65-498a-9bf8-d2f605bc5709 ']' 00:22:12.223 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:12.482 [2024-07-12 12:03:02.623617] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:12.482 [2024-07-12 12:03:02.623630] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:12.482 [2024-07-12 12:03:02.623668] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:12.482 [2024-07-12 12:03:02.623706] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:12.482 [2024-07-12 12:03:02.623712] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe15890 name raid_bdev1, state offline 00:22:12.482 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.482 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:12.741 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:12.741 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:12.741 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:12.741 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:22:12.741 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:12.741 12:03:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:12.998 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:12.998 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:13.257 12:03:03 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:13.257 [2024-07-12 12:03:03.473809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:13.257 [2024-07-12 12:03:03.474798] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:13.257 [2024-07-12 12:03:03.474838] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:13.257 [2024-07-12 12:03:03.474864] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:13.257 [2024-07-12 12:03:03.474873] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:13.257 [2024-07-12 12:03:03.474894] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc95510 name raid_bdev1, state configuring 00:22:13.257 request: 00:22:13.257 { 00:22:13.257 "name": "raid_bdev1", 00:22:13.257 "raid_level": "raid1", 00:22:13.257 "base_bdevs": [ 
00:22:13.257 "malloc1", 00:22:13.257 "malloc2" 00:22:13.257 ], 00:22:13.257 "superblock": false, 00:22:13.257 "method": "bdev_raid_create", 00:22:13.257 "req_id": 1 00:22:13.257 } 00:22:13.257 Got JSON-RPC error response 00:22:13.257 response: 00:22:13.257 { 00:22:13.257 "code": -17, 00:22:13.257 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:13.257 } 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.257 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:13.516 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:13.516 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:13.516 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:13.775 [2024-07-12 12:03:03.794609] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:13.775 [2024-07-12 12:03:03.794639] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:13.775 [2024-07-12 12:03:03.794649] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe17c20 
00:22:13.775 [2024-07-12 12:03:03.794655] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:13.775 [2024-07-12 12:03:03.795722] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:13.775 [2024-07-12 12:03:03.795742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:13.775 [2024-07-12 12:03:03.795773] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:13.775 [2024-07-12 12:03:03.795789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:13.775 pt1 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:13.775 "name": "raid_bdev1", 00:22:13.775 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:13.775 "strip_size_kb": 0, 00:22:13.775 "state": "configuring", 00:22:13.775 "raid_level": "raid1", 00:22:13.775 "superblock": true, 00:22:13.775 "num_base_bdevs": 2, 00:22:13.775 "num_base_bdevs_discovered": 1, 00:22:13.775 "num_base_bdevs_operational": 2, 00:22:13.775 "base_bdevs_list": [ 00:22:13.775 { 00:22:13.775 "name": "pt1", 00:22:13.775 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:13.775 "is_configured": true, 00:22:13.775 "data_offset": 256, 00:22:13.775 "data_size": 7936 00:22:13.775 }, 00:22:13.775 { 00:22:13.775 "name": null, 00:22:13.775 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:13.775 "is_configured": false, 00:22:13.775 "data_offset": 256, 00:22:13.775 "data_size": 7936 00:22:13.775 } 00:22:13.775 ] 00:22:13.775 }' 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:13.775 12:03:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:14.342 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:22:14.342 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:14.342 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:14.342 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:14.600 [2024-07-12 12:03:04.616742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:14.600 [2024-07-12 12:03:04.616787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:14.600 [2024-07-12 12:03:04.616799] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe17f90 00:22:14.601 [2024-07-12 12:03:04.616821] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:14.601 [2024-07-12 12:03:04.616963] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:14.601 [2024-07-12 12:03:04.616972] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:14.601 [2024-07-12 12:03:04.616999] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:14.601 [2024-07-12 12:03:04.617011] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:14.601 [2024-07-12 12:03:04.617077] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe189d0 00:22:14.601 [2024-07-12 12:03:04.617083] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:14.601 [2024-07-12 12:03:04.617120] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe19b30 00:22:14.601 [2024-07-12 12:03:04.617191] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe189d0 00:22:14.601 [2024-07-12 12:03:04.617196] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe189d0 00:22:14.601 [2024-07-12 12:03:04.617243] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:14.601 pt2 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < 
num_base_bdevs )) 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.601 "name": "raid_bdev1", 00:22:14.601 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:14.601 "strip_size_kb": 0, 00:22:14.601 "state": "online", 00:22:14.601 "raid_level": "raid1", 00:22:14.601 "superblock": true, 00:22:14.601 "num_base_bdevs": 2, 00:22:14.601 
"num_base_bdevs_discovered": 2, 00:22:14.601 "num_base_bdevs_operational": 2, 00:22:14.601 "base_bdevs_list": [ 00:22:14.601 { 00:22:14.601 "name": "pt1", 00:22:14.601 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:14.601 "is_configured": true, 00:22:14.601 "data_offset": 256, 00:22:14.601 "data_size": 7936 00:22:14.601 }, 00:22:14.601 { 00:22:14.601 "name": "pt2", 00:22:14.601 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:14.601 "is_configured": true, 00:22:14.601 "data_offset": 256, 00:22:14.601 "data_size": 7936 00:22:14.601 } 00:22:14.601 ] 00:22:14.601 }' 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.601 12:03:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:15.168 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:15.168 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:15.168 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:15.168 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:15.168 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:15.168 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:22:15.168 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:15.168 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:15.427 [2024-07-12 12:03:05.427013] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:15.427 12:03:05 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:15.427 "name": "raid_bdev1", 00:22:15.427 "aliases": [ 00:22:15.427 "97afe5a5-bb65-498a-9bf8-d2f605bc5709" 00:22:15.427 ], 00:22:15.427 "product_name": "Raid Volume", 00:22:15.427 "block_size": 4096, 00:22:15.427 "num_blocks": 7936, 00:22:15.427 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:15.427 "md_size": 32, 00:22:15.427 "md_interleave": false, 00:22:15.427 "dif_type": 0, 00:22:15.427 "assigned_rate_limits": { 00:22:15.427 "rw_ios_per_sec": 0, 00:22:15.427 "rw_mbytes_per_sec": 0, 00:22:15.427 "r_mbytes_per_sec": 0, 00:22:15.427 "w_mbytes_per_sec": 0 00:22:15.427 }, 00:22:15.427 "claimed": false, 00:22:15.427 "zoned": false, 00:22:15.427 "supported_io_types": { 00:22:15.427 "read": true, 00:22:15.427 "write": true, 00:22:15.427 "unmap": false, 00:22:15.427 "flush": false, 00:22:15.427 "reset": true, 00:22:15.427 "nvme_admin": false, 00:22:15.427 "nvme_io": false, 00:22:15.427 "nvme_io_md": false, 00:22:15.427 "write_zeroes": true, 00:22:15.427 "zcopy": false, 00:22:15.427 "get_zone_info": false, 00:22:15.427 "zone_management": false, 00:22:15.427 "zone_append": false, 00:22:15.427 "compare": false, 00:22:15.427 "compare_and_write": false, 00:22:15.427 "abort": false, 00:22:15.427 "seek_hole": false, 00:22:15.427 "seek_data": false, 00:22:15.427 "copy": false, 00:22:15.427 "nvme_iov_md": false 00:22:15.427 }, 00:22:15.427 "memory_domains": [ 00:22:15.427 { 00:22:15.427 "dma_device_id": "system", 00:22:15.427 "dma_device_type": 1 00:22:15.427 }, 00:22:15.427 { 00:22:15.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.427 "dma_device_type": 2 00:22:15.427 }, 00:22:15.427 { 00:22:15.427 "dma_device_id": "system", 00:22:15.427 "dma_device_type": 1 00:22:15.427 }, 00:22:15.427 { 00:22:15.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.427 "dma_device_type": 2 00:22:15.427 } 00:22:15.427 ], 00:22:15.427 "driver_specific": { 00:22:15.427 "raid": { 
00:22:15.427 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:15.427 "strip_size_kb": 0, 00:22:15.427 "state": "online", 00:22:15.427 "raid_level": "raid1", 00:22:15.427 "superblock": true, 00:22:15.427 "num_base_bdevs": 2, 00:22:15.427 "num_base_bdevs_discovered": 2, 00:22:15.427 "num_base_bdevs_operational": 2, 00:22:15.427 "base_bdevs_list": [ 00:22:15.427 { 00:22:15.427 "name": "pt1", 00:22:15.427 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:15.427 "is_configured": true, 00:22:15.427 "data_offset": 256, 00:22:15.427 "data_size": 7936 00:22:15.427 }, 00:22:15.427 { 00:22:15.427 "name": "pt2", 00:22:15.427 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:15.427 "is_configured": true, 00:22:15.427 "data_offset": 256, 00:22:15.427 "data_size": 7936 00:22:15.427 } 00:22:15.427 ] 00:22:15.427 } 00:22:15.427 } 00:22:15.427 }' 00:22:15.427 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:15.427 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:15.427 pt2' 00:22:15.427 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:15.427 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:15.427 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:15.427 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:15.427 "name": "pt1", 00:22:15.427 "aliases": [ 00:22:15.427 "00000000-0000-0000-0000-000000000001" 00:22:15.427 ], 00:22:15.427 "product_name": "passthru", 00:22:15.427 "block_size": 4096, 00:22:15.427 "num_blocks": 8192, 00:22:15.427 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:22:15.427 "md_size": 32, 00:22:15.427 "md_interleave": false, 00:22:15.427 "dif_type": 0, 00:22:15.427 "assigned_rate_limits": { 00:22:15.427 "rw_ios_per_sec": 0, 00:22:15.427 "rw_mbytes_per_sec": 0, 00:22:15.427 "r_mbytes_per_sec": 0, 00:22:15.427 "w_mbytes_per_sec": 0 00:22:15.427 }, 00:22:15.427 "claimed": true, 00:22:15.427 "claim_type": "exclusive_write", 00:22:15.427 "zoned": false, 00:22:15.427 "supported_io_types": { 00:22:15.427 "read": true, 00:22:15.427 "write": true, 00:22:15.427 "unmap": true, 00:22:15.427 "flush": true, 00:22:15.428 "reset": true, 00:22:15.428 "nvme_admin": false, 00:22:15.428 "nvme_io": false, 00:22:15.428 "nvme_io_md": false, 00:22:15.428 "write_zeroes": true, 00:22:15.428 "zcopy": true, 00:22:15.428 "get_zone_info": false, 00:22:15.428 "zone_management": false, 00:22:15.428 "zone_append": false, 00:22:15.428 "compare": false, 00:22:15.428 "compare_and_write": false, 00:22:15.428 "abort": true, 00:22:15.428 "seek_hole": false, 00:22:15.428 "seek_data": false, 00:22:15.428 "copy": true, 00:22:15.428 "nvme_iov_md": false 00:22:15.428 }, 00:22:15.428 "memory_domains": [ 00:22:15.428 { 00:22:15.428 "dma_device_id": "system", 00:22:15.428 "dma_device_type": 1 00:22:15.428 }, 00:22:15.428 { 00:22:15.428 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.428 "dma_device_type": 2 00:22:15.428 } 00:22:15.428 ], 00:22:15.428 "driver_specific": { 00:22:15.428 "passthru": { 00:22:15.428 "name": "pt1", 00:22:15.428 "base_bdev_name": "malloc1" 00:22:15.428 } 00:22:15.428 } 00:22:15.428 }' 00:22:15.428 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.686 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.945 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:15.945 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:15.945 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:15.945 12:03:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:15.945 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:15.945 "name": "pt2", 00:22:15.945 "aliases": [ 00:22:15.945 "00000000-0000-0000-0000-000000000002" 00:22:15.945 ], 00:22:15.945 "product_name": "passthru", 00:22:15.945 "block_size": 4096, 00:22:15.945 "num_blocks": 8192, 00:22:15.945 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:15.945 "md_size": 32, 00:22:15.945 "md_interleave": false, 00:22:15.945 "dif_type": 0, 00:22:15.945 "assigned_rate_limits": { 00:22:15.945 "rw_ios_per_sec": 0, 00:22:15.945 "rw_mbytes_per_sec": 0, 00:22:15.945 "r_mbytes_per_sec": 0, 00:22:15.945 
"w_mbytes_per_sec": 0 00:22:15.945 }, 00:22:15.945 "claimed": true, 00:22:15.945 "claim_type": "exclusive_write", 00:22:15.945 "zoned": false, 00:22:15.945 "supported_io_types": { 00:22:15.945 "read": true, 00:22:15.945 "write": true, 00:22:15.945 "unmap": true, 00:22:15.945 "flush": true, 00:22:15.945 "reset": true, 00:22:15.945 "nvme_admin": false, 00:22:15.945 "nvme_io": false, 00:22:15.945 "nvme_io_md": false, 00:22:15.945 "write_zeroes": true, 00:22:15.945 "zcopy": true, 00:22:15.945 "get_zone_info": false, 00:22:15.945 "zone_management": false, 00:22:15.945 "zone_append": false, 00:22:15.945 "compare": false, 00:22:15.945 "compare_and_write": false, 00:22:15.945 "abort": true, 00:22:15.945 "seek_hole": false, 00:22:15.945 "seek_data": false, 00:22:15.945 "copy": true, 00:22:15.945 "nvme_iov_md": false 00:22:15.945 }, 00:22:15.945 "memory_domains": [ 00:22:15.945 { 00:22:15.945 "dma_device_id": "system", 00:22:15.945 "dma_device_type": 1 00:22:15.945 }, 00:22:15.945 { 00:22:15.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.945 "dma_device_type": 2 00:22:15.945 } 00:22:15.945 ], 00:22:15.945 "driver_specific": { 00:22:15.945 "passthru": { 00:22:15.945 "name": "pt2", 00:22:15.945 "base_bdev_name": "malloc2" 00:22:15.945 } 00:22:15.945 } 00:22:15.945 }' 00:22:15.945 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.945 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.945 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:15.945 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:16.204 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:16.204 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:16.204 12:03:06 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:16.204 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:16.204 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:22:16.204 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:16.204 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:16.204 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:16.204 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:16.204 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:16.462 [2024-07-12 12:03:06.549943] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:16.462 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 97afe5a5-bb65-498a-9bf8-d2f605bc5709 '!=' 97afe5a5-bb65-498a-9bf8-d2f605bc5709 ']' 00:22:16.462 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:16.462 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:16.462 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:22:16.462 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:16.462 [2024-07-12 12:03:06.706202] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:16.721 12:03:06 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:16.721 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:16.721 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:16.721 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:16.721 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:16.721 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:16.721 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.722 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.722 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.722 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.722 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.722 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:16.722 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:16.722 "name": "raid_bdev1", 00:22:16.722 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:16.722 "strip_size_kb": 0, 00:22:16.722 "state": "online", 00:22:16.722 "raid_level": "raid1", 00:22:16.722 "superblock": true, 00:22:16.722 "num_base_bdevs": 2, 00:22:16.722 "num_base_bdevs_discovered": 1, 00:22:16.722 "num_base_bdevs_operational": 1, 00:22:16.722 
"base_bdevs_list": [ 00:22:16.722 { 00:22:16.722 "name": null, 00:22:16.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.722 "is_configured": false, 00:22:16.722 "data_offset": 256, 00:22:16.722 "data_size": 7936 00:22:16.722 }, 00:22:16.722 { 00:22:16.722 "name": "pt2", 00:22:16.722 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:16.722 "is_configured": true, 00:22:16.722 "data_offset": 256, 00:22:16.722 "data_size": 7936 00:22:16.722 } 00:22:16.722 ] 00:22:16.722 }' 00:22:16.722 12:03:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:16.722 12:03:06 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:17.290 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:17.290 [2024-07-12 12:03:07.508254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:17.290 [2024-07-12 12:03:07.508273] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:17.290 [2024-07-12 12:03:07.508306] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:17.290 [2024-07-12 12:03:07.508334] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:17.290 [2024-07-12 12:03:07.508340] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe189d0 name raid_bdev1, state offline 00:22:17.290 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.290 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:17.548 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- 
# raid_bdev= 00:22:17.548 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:17.548 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:17.548 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:17.548 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:17.807 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:17.807 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:17.807 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:17.807 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:17.807 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:22:17.807 12:03:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:17.807 [2024-07-12 12:03:08.025586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:17.807 [2024-07-12 12:03:08.025618] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:17.807 [2024-07-12 12:03:08.025627] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe17980 00:22:17.807 [2024-07-12 12:03:08.025633] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:17.807 [2024-07-12 12:03:08.026700] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:17.807 [2024-07-12 
12:03:08.026719] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:17.807 [2024-07-12 12:03:08.026749] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:17.807 [2024-07-12 12:03:08.026767] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:17.807 [2024-07-12 12:03:08.026820] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe1a130 00:22:17.807 [2024-07-12 12:03:08.026827] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:17.807 [2024-07-12 12:03:08.026864] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe1a410 00:22:17.807 [2024-07-12 12:03:08.026930] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe1a130 00:22:17.807 [2024-07-12 12:03:08.026935] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe1a130 00:22:17.807 [2024-07-12 12:03:08.026978] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:17.807 pt2 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:17.807 
12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.807 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.066 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.066 "name": "raid_bdev1", 00:22:18.066 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:18.066 "strip_size_kb": 0, 00:22:18.066 "state": "online", 00:22:18.066 "raid_level": "raid1", 00:22:18.066 "superblock": true, 00:22:18.066 "num_base_bdevs": 2, 00:22:18.066 "num_base_bdevs_discovered": 1, 00:22:18.066 "num_base_bdevs_operational": 1, 00:22:18.066 "base_bdevs_list": [ 00:22:18.066 { 00:22:18.066 "name": null, 00:22:18.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.066 "is_configured": false, 00:22:18.066 "data_offset": 256, 00:22:18.066 "data_size": 7936 00:22:18.066 }, 00:22:18.066 { 00:22:18.066 "name": "pt2", 00:22:18.066 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:18.066 "is_configured": true, 00:22:18.066 "data_offset": 256, 00:22:18.066 "data_size": 7936 00:22:18.066 } 00:22:18.066 ] 00:22:18.066 }' 00:22:18.066 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.066 12:03:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:18.632 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:18.632 [2024-07-12 12:03:08.831673] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:18.632 [2024-07-12 12:03:08.831693] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:18.632 [2024-07-12 12:03:08.831730] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:18.632 [2024-07-12 12:03:08.831759] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:18.632 [2024-07-12 12:03:08.831765] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe1a130 name raid_bdev1, state offline 00:22:18.632 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.632 12:03:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:18.891 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:18.891 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:18.891 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:22:18.891 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:19.149 [2024-07-12 12:03:09.160513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:19.149 [2024-07-12 12:03:09.160554] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.149 [2024-07-12 12:03:09.160565] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc93900 00:22:19.149 [2024-07-12 12:03:09.160587] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.149 [2024-07-12 12:03:09.161634] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.149 [2024-07-12 12:03:09.161654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:19.149 [2024-07-12 12:03:09.161686] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:19.149 [2024-07-12 12:03:09.161703] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:19.149 [2024-07-12 12:03:09.161763] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:19.149 [2024-07-12 12:03:09.161770] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:19.149 [2024-07-12 12:03:09.161779] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe1a520 name raid_bdev1, state configuring 00:22:19.149 [2024-07-12 12:03:09.161793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:19.149 [2024-07-12 12:03:09.161828] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe1a520 00:22:19.149 [2024-07-12 12:03:09.161833] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:19.149 [2024-07-12 12:03:09.161873] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc93fb0 00:22:19.149 [2024-07-12 12:03:09.161937] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe1a520 00:22:19.149 [2024-07-12 12:03:09.161942] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe1a520 00:22:19.149 [2024-07-12 12:03:09.161986] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:19.149 
pt1 00:22:19.149 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:22:19.149 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:19.149 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:19.149 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:19.149 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.149 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.149 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:19.149 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.150 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.150 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.150 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.150 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.150 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.150 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.150 "name": "raid_bdev1", 00:22:19.150 "uuid": "97afe5a5-bb65-498a-9bf8-d2f605bc5709", 00:22:19.150 "strip_size_kb": 0, 00:22:19.150 "state": "online", 00:22:19.150 "raid_level": "raid1", 
00:22:19.150 "superblock": true, 00:22:19.150 "num_base_bdevs": 2, 00:22:19.150 "num_base_bdevs_discovered": 1, 00:22:19.150 "num_base_bdevs_operational": 1, 00:22:19.150 "base_bdevs_list": [ 00:22:19.150 { 00:22:19.150 "name": null, 00:22:19.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.150 "is_configured": false, 00:22:19.150 "data_offset": 256, 00:22:19.150 "data_size": 7936 00:22:19.150 }, 00:22:19.150 { 00:22:19.150 "name": "pt2", 00:22:19.150 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:19.150 "is_configured": true, 00:22:19.150 "data_offset": 256, 00:22:19.150 "data_size": 7936 00:22:19.150 } 00:22:19.150 ] 00:22:19.150 }' 00:22:19.150 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.150 12:03:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:19.717 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:19.717 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:19.976 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:19.976 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:19.976 12:03:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:19.976 [2024-07-12 12:03:10.143205] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 97afe5a5-bb65-498a-9bf8-d2f605bc5709 '!=' 97afe5a5-bb65-498a-9bf8-d2f605bc5709 ']' 00:22:19.976 
12:03:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 735143 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 735143 ']' 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 735143 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 735143 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 735143' 00:22:19.976 killing process with pid 735143 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 735143 00:22:19.976 [2024-07-12 12:03:10.202306] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:19.976 [2024-07-12 12:03:10.202344] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:19.976 [2024-07-12 12:03:10.202375] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:19.976 [2024-07-12 12:03:10.202381] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe1a520 name raid_bdev1, state offline 00:22:19.976 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 735143 00:22:19.976 [2024-07-12 12:03:10.220999] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: 
raid_bdev_exit 00:22:20.235 12:03:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:22:20.235 00:22:20.235 real 0m11.564s 00:22:20.235 user 0m21.193s 00:22:20.235 sys 0m1.788s 00:22:20.235 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:20.235 12:03:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:20.235 ************************************ 00:22:20.235 END TEST raid_superblock_test_md_separate 00:22:20.235 ************************************ 00:22:20.235 12:03:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:20.235 12:03:10 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:22:20.235 12:03:10 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:22:20.235 12:03:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:20.235 12:03:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:20.235 12:03:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:20.235 ************************************ 00:22:20.235 START TEST raid_rebuild_test_sb_md_separate 00:22:20.235 ************************************ 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@572 -- # local verify=true 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:20.235 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 
00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=737802 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 737802 /var/tmp/spdk-raid.sock 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 737802 ']' 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:20.236 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:20.236 12:03:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:20.494 [2024-07-12 12:03:10.506530] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:22:20.495 [2024-07-12 12:03:10.506581] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid737802 ] 00:22:20.495 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:20.495 Zero copy mechanism will not be used. 00:22:20.495 [2024-07-12 12:03:10.568468] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:20.495 [2024-07-12 12:03:10.646160] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:20.495 [2024-07-12 12:03:10.700526] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:20.495 [2024-07-12 12:03:10.700554] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:21.063 12:03:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:21.063 12:03:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:22:21.063 12:03:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:21.063 12:03:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:22:21.322 BaseBdev1_malloc 00:22:21.322 12:03:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:21.581 [2024-07-12 12:03:11.621123] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:21.581 [2024-07-12 12:03:11.621156] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:21.581 [2024-07-12 12:03:11.621170] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1430a10 00:22:21.581 [2024-07-12 12:03:11.621192] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:21.581 [2024-07-12 12:03:11.622213] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:21.581 [2024-07-12 12:03:11.622232] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:21.581 BaseBdev1 00:22:21.581 12:03:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:21.581 12:03:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:22:21.581 BaseBdev2_malloc 00:22:21.581 12:03:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:21.840 [2024-07-12 12:03:11.942248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:21.840 [2024-07-12 12:03:11.942279] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:21.840 [2024-07-12 12:03:11.942291] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1587750 00:22:21.840 [2024-07-12 12:03:11.942297] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:21.840 [2024-07-12 12:03:11.943309] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:21.840 [2024-07-12 12:03:11.943329] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:21.840 BaseBdev2 00:22:21.840 12:03:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:22:22.098 spare_malloc 00:22:22.099 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:22.099 spare_delay 00:22:22.099 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:22.357 [2024-07-12 12:03:12.431645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:22.357 [2024-07-12 12:03:12.431676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.357 [2024-07-12 12:03:12.431689] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1574240 00:22:22.357 [2024-07-12 12:03:12.431694] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.357 [2024-07-12 12:03:12.432631] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.357 [2024-07-12 12:03:12.432650] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:22.357 spare 00:22:22.357 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:22.357 [2024-07-12 12:03:12.596088] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:22.357 [2024-07-12 12:03:12.597033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:22.357 [2024-07-12 12:03:12.597147] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1574f00 00:22:22.357 [2024-07-12 12:03:12.597156] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:22.357 [2024-07-12 12:03:12.597212] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1576390 00:22:22.357 [2024-07-12 12:03:12.597290] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1574f00 00:22:22.357 [2024-07-12 12:03:12.597295] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1574f00 00:22:22.357 [2024-07-12 12:03:12.597342] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.616 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:22.616 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:22.616 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.616 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.616 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.617 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:22.617 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.617 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.617 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.617 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.617 12:03:12 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.617 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.617 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.617 "name": "raid_bdev1", 00:22:22.617 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:22.617 "strip_size_kb": 0, 00:22:22.617 "state": "online", 00:22:22.617 "raid_level": "raid1", 00:22:22.617 "superblock": true, 00:22:22.617 "num_base_bdevs": 2, 00:22:22.617 "num_base_bdevs_discovered": 2, 00:22:22.617 "num_base_bdevs_operational": 2, 00:22:22.617 "base_bdevs_list": [ 00:22:22.617 { 00:22:22.617 "name": "BaseBdev1", 00:22:22.617 "uuid": "6508ac71-662e-50c3-9d21-6c251bb836e2", 00:22:22.617 "is_configured": true, 00:22:22.617 "data_offset": 256, 00:22:22.617 "data_size": 7936 00:22:22.617 }, 00:22:22.617 { 00:22:22.617 "name": "BaseBdev2", 00:22:22.617 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:22.617 "is_configured": true, 00:22:22.617 "data_offset": 256, 00:22:22.617 "data_size": 7936 00:22:22.617 } 00:22:22.617 ] 00:22:22.617 }' 00:22:22.617 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.617 12:03:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:23.193 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:23.193 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:23.193 [2024-07-12 12:03:13.394308] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:23.193 
12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:22:23.193 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.194 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:23.452 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:23.452 12:03:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:23.712 [2024-07-12 12:03:13.735131] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1576390 00:22:23.712 /dev/nbd0 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:23.712 1+0 records in 00:22:23.712 1+0 records out 00:22:23.712 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000178905 s, 22.9 MB/s 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:23.712 12:03:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:22:24.280 7936+0 records in 00:22:24.280 7936+0 records out 00:22:24.280 32505856 bytes (33 MB, 31 MiB) copied, 0.46607 s, 69.7 MB/s 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:22:24.280 12:03:14 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:24.280 [2024-07-12 12:03:14.445971] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:22:24.280 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:24.608 [2024-07-12 12:03:14.598395] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:24.608 "name": "raid_bdev1", 00:22:24.608 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:24.608 "strip_size_kb": 0, 00:22:24.608 "state": "online", 00:22:24.608 "raid_level": "raid1", 00:22:24.608 "superblock": true, 00:22:24.608 "num_base_bdevs": 2, 00:22:24.608 "num_base_bdevs_discovered": 1, 00:22:24.608 "num_base_bdevs_operational": 1, 00:22:24.608 "base_bdevs_list": [ 00:22:24.608 { 00:22:24.608 "name": null, 00:22:24.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.608 "is_configured": false, 00:22:24.608 "data_offset": 256, 00:22:24.608 "data_size": 7936 00:22:24.608 }, 00:22:24.608 { 00:22:24.608 "name": "BaseBdev2", 
00:22:24.608 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:24.608 "is_configured": true, 00:22:24.608 "data_offset": 256, 00:22:24.608 "data_size": 7936 00:22:24.608 } 00:22:24.608 ] 00:22:24.608 }' 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:24.608 12:03:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:25.201 12:03:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:25.201 [2024-07-12 12:03:15.408512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:25.201 [2024-07-12 12:03:15.410460] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1573bf0 00:22:25.201 [2024-07-12 12:03:15.411857] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:25.201 12:03:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.580 12:03:16 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:26.580 "name": "raid_bdev1", 00:22:26.580 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:26.580 "strip_size_kb": 0, 00:22:26.580 "state": "online", 00:22:26.580 "raid_level": "raid1", 00:22:26.580 "superblock": true, 00:22:26.580 "num_base_bdevs": 2, 00:22:26.580 "num_base_bdevs_discovered": 2, 00:22:26.580 "num_base_bdevs_operational": 2, 00:22:26.580 "process": { 00:22:26.580 "type": "rebuild", 00:22:26.580 "target": "spare", 00:22:26.580 "progress": { 00:22:26.580 "blocks": 2816, 00:22:26.580 "percent": 35 00:22:26.580 } 00:22:26.580 }, 00:22:26.580 "base_bdevs_list": [ 00:22:26.580 { 00:22:26.580 "name": "spare", 00:22:26.580 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:26.580 "is_configured": true, 00:22:26.580 "data_offset": 256, 00:22:26.580 "data_size": 7936 00:22:26.580 }, 00:22:26.580 { 00:22:26.580 "name": "BaseBdev2", 00:22:26.580 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:26.580 "is_configured": true, 00:22:26.580 "data_offset": 256, 00:22:26.580 "data_size": 7936 00:22:26.580 } 00:22:26.580 ] 00:22:26.580 }' 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:26.580 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:22:26.839 [2024-07-12 12:03:16.848973] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:26.839 [2024-07-12 12:03:16.922447] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:26.839 [2024-07-12 12:03:16.922476] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:26.839 [2024-07-12 12:03:16.922485] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:26.839 [2024-07-12 12:03:16.922505] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.839 12:03:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.098 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.098 "name": "raid_bdev1", 00:22:27.098 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:27.098 "strip_size_kb": 0, 00:22:27.098 "state": "online", 00:22:27.098 "raid_level": "raid1", 00:22:27.098 "superblock": true, 00:22:27.098 "num_base_bdevs": 2, 00:22:27.098 "num_base_bdevs_discovered": 1, 00:22:27.098 "num_base_bdevs_operational": 1, 00:22:27.098 "base_bdevs_list": [ 00:22:27.098 { 00:22:27.098 "name": null, 00:22:27.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.098 "is_configured": false, 00:22:27.098 "data_offset": 256, 00:22:27.098 "data_size": 7936 00:22:27.098 }, 00:22:27.098 { 00:22:27.098 "name": "BaseBdev2", 00:22:27.098 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:27.098 "is_configured": true, 00:22:27.098 "data_offset": 256, 00:22:27.098 "data_size": 7936 00:22:27.098 } 00:22:27.098 ] 00:22:27.098 }' 00:22:27.098 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.098 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:27.357 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:27.357 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:27.357 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:27.357 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:27.357 12:03:17 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:27.357 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.357 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.616 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:27.616 "name": "raid_bdev1", 00:22:27.616 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:27.616 "strip_size_kb": 0, 00:22:27.616 "state": "online", 00:22:27.616 "raid_level": "raid1", 00:22:27.616 "superblock": true, 00:22:27.616 "num_base_bdevs": 2, 00:22:27.616 "num_base_bdevs_discovered": 1, 00:22:27.616 "num_base_bdevs_operational": 1, 00:22:27.616 "base_bdevs_list": [ 00:22:27.616 { 00:22:27.616 "name": null, 00:22:27.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.616 "is_configured": false, 00:22:27.617 "data_offset": 256, 00:22:27.617 "data_size": 7936 00:22:27.617 }, 00:22:27.617 { 00:22:27.617 "name": "BaseBdev2", 00:22:27.617 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:27.617 "is_configured": true, 00:22:27.617 "data_offset": 256, 00:22:27.617 "data_size": 7936 00:22:27.617 } 00:22:27.617 ] 00:22:27.617 }' 00:22:27.617 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:27.617 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:27.617 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:27.617 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:27.617 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:27.875 [2024-07-12 12:03:17.976139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:27.875 [2024-07-12 12:03:17.978075] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15cb260 00:22:27.875 [2024-07-12 12:03:17.979111] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:27.875 12:03:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:28.813 12:03:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:28.813 12:03:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:28.813 12:03:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:28.813 12:03:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:28.813 12:03:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:28.813 12:03:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.813 12:03:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:29.072 "name": "raid_bdev1", 00:22:29.072 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:29.072 "strip_size_kb": 0, 00:22:29.072 "state": "online", 00:22:29.072 "raid_level": "raid1", 00:22:29.072 "superblock": true, 00:22:29.072 "num_base_bdevs": 2, 
00:22:29.072 "num_base_bdevs_discovered": 2, 00:22:29.072 "num_base_bdevs_operational": 2, 00:22:29.072 "process": { 00:22:29.072 "type": "rebuild", 00:22:29.072 "target": "spare", 00:22:29.072 "progress": { 00:22:29.072 "blocks": 2816, 00:22:29.072 "percent": 35 00:22:29.072 } 00:22:29.072 }, 00:22:29.072 "base_bdevs_list": [ 00:22:29.072 { 00:22:29.072 "name": "spare", 00:22:29.072 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:29.072 "is_configured": true, 00:22:29.072 "data_offset": 256, 00:22:29.072 "data_size": 7936 00:22:29.072 }, 00:22:29.072 { 00:22:29.072 "name": "BaseBdev2", 00:22:29.072 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:29.072 "is_configured": true, 00:22:29.072 "data_offset": 256, 00:22:29.072 "data_size": 7936 00:22:29.072 } 00:22:29.072 ] 00:22:29.072 }' 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:29.072 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=819 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.072 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.332 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:29.332 "name": "raid_bdev1", 00:22:29.332 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:29.332 "strip_size_kb": 0, 00:22:29.332 "state": "online", 00:22:29.332 "raid_level": "raid1", 00:22:29.332 "superblock": true, 00:22:29.332 "num_base_bdevs": 2, 00:22:29.332 "num_base_bdevs_discovered": 2, 00:22:29.332 "num_base_bdevs_operational": 2, 00:22:29.332 "process": { 00:22:29.332 "type": "rebuild", 00:22:29.332 "target": "spare", 00:22:29.332 "progress": { 00:22:29.332 "blocks": 3584, 00:22:29.332 "percent": 45 00:22:29.332 } 00:22:29.332 }, 00:22:29.332 "base_bdevs_list": [ 00:22:29.332 { 00:22:29.332 "name": "spare", 00:22:29.332 "uuid": 
"6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:29.332 "is_configured": true, 00:22:29.332 "data_offset": 256, 00:22:29.332 "data_size": 7936 00:22:29.332 }, 00:22:29.332 { 00:22:29.332 "name": "BaseBdev2", 00:22:29.332 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:29.332 "is_configured": true, 00:22:29.332 "data_offset": 256, 00:22:29.332 "data_size": 7936 00:22:29.332 } 00:22:29.332 ] 00:22:29.332 }' 00:22:29.332 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:29.332 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:29.332 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:29.332 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:29.332 12:03:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.708 12:03:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:30.708 "name": "raid_bdev1", 00:22:30.708 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:30.708 "strip_size_kb": 0, 00:22:30.708 "state": "online", 00:22:30.708 "raid_level": "raid1", 00:22:30.708 "superblock": true, 00:22:30.708 "num_base_bdevs": 2, 00:22:30.708 "num_base_bdevs_discovered": 2, 00:22:30.708 "num_base_bdevs_operational": 2, 00:22:30.708 "process": { 00:22:30.708 "type": "rebuild", 00:22:30.708 "target": "spare", 00:22:30.708 "progress": { 00:22:30.708 "blocks": 6656, 00:22:30.708 "percent": 83 00:22:30.708 } 00:22:30.708 }, 00:22:30.708 "base_bdevs_list": [ 00:22:30.708 { 00:22:30.708 "name": "spare", 00:22:30.708 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:30.708 "is_configured": true, 00:22:30.708 "data_offset": 256, 00:22:30.708 "data_size": 7936 00:22:30.708 }, 00:22:30.708 { 00:22:30.708 "name": "BaseBdev2", 00:22:30.708 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:30.708 "is_configured": true, 00:22:30.708 "data_offset": 256, 00:22:30.708 "data_size": 7936 00:22:30.708 } 00:22:30.708 ] 00:22:30.708 }' 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:30.708 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:30.709 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:30.709 12:03:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:30.974 [2024-07-12 12:03:21.100856] bdev_raid.c:2789:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:22:30.974 [2024-07-12 12:03:21.100897] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:30.974 [2024-07-12 12:03:21.100967] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:31.543 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:31.543 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:31.543 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:31.543 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:31.543 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:31.543 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:31.801 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.801 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:31.801 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:31.801 "name": "raid_bdev1", 00:22:31.801 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:31.801 "strip_size_kb": 0, 00:22:31.801 "state": "online", 00:22:31.801 "raid_level": "raid1", 00:22:31.801 "superblock": true, 00:22:31.801 "num_base_bdevs": 2, 00:22:31.801 "num_base_bdevs_discovered": 2, 00:22:31.802 "num_base_bdevs_operational": 2, 00:22:31.802 "base_bdevs_list": [ 00:22:31.802 { 00:22:31.802 "name": "spare", 00:22:31.802 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 
00:22:31.802 "is_configured": true, 00:22:31.802 "data_offset": 256, 00:22:31.802 "data_size": 7936 00:22:31.802 }, 00:22:31.802 { 00:22:31.802 "name": "BaseBdev2", 00:22:31.802 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:31.802 "is_configured": true, 00:22:31.802 "data_offset": 256, 00:22:31.802 "data_size": 7936 00:22:31.802 } 00:22:31.802 ] 00:22:31.802 }' 00:22:31.802 12:03:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.802 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.061 12:03:22 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:32.061 "name": "raid_bdev1", 00:22:32.061 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:32.061 "strip_size_kb": 0, 00:22:32.061 "state": "online", 00:22:32.061 "raid_level": "raid1", 00:22:32.061 "superblock": true, 00:22:32.061 "num_base_bdevs": 2, 00:22:32.061 "num_base_bdevs_discovered": 2, 00:22:32.061 "num_base_bdevs_operational": 2, 00:22:32.061 "base_bdevs_list": [ 00:22:32.061 { 00:22:32.061 "name": "spare", 00:22:32.061 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:32.061 "is_configured": true, 00:22:32.061 "data_offset": 256, 00:22:32.061 "data_size": 7936 00:22:32.061 }, 00:22:32.061 { 00:22:32.061 "name": "BaseBdev2", 00:22:32.061 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:32.061 "is_configured": true, 00:22:32.061 "data_offset": 256, 00:22:32.061 "data_size": 7936 00:22:32.061 } 00:22:32.061 ] 00:22:32.061 }' 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.061 12:03:22 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.061 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.320 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.320 "name": "raid_bdev1", 00:22:32.320 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:32.320 "strip_size_kb": 0, 00:22:32.320 "state": "online", 00:22:32.320 "raid_level": "raid1", 00:22:32.320 "superblock": true, 00:22:32.320 "num_base_bdevs": 2, 00:22:32.320 "num_base_bdevs_discovered": 2, 00:22:32.320 "num_base_bdevs_operational": 2, 00:22:32.320 "base_bdevs_list": [ 00:22:32.320 { 00:22:32.320 "name": "spare", 00:22:32.320 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:32.320 "is_configured": true, 00:22:32.320 "data_offset": 256, 00:22:32.320 "data_size": 7936 00:22:32.320 }, 00:22:32.320 { 00:22:32.320 "name": "BaseBdev2", 00:22:32.320 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:32.320 "is_configured": true, 00:22:32.320 "data_offset": 256, 00:22:32.320 "data_size": 7936 00:22:32.320 } 00:22:32.320 ] 
00:22:32.320 }' 00:22:32.320 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.320 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:32.888 12:03:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:32.888 [2024-07-12 12:03:23.104914] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:32.888 [2024-07-12 12:03:23.104934] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:32.888 [2024-07-12 12:03:23.104974] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:32.888 [2024-07-12 12:03:23.105011] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:32.888 [2024-07-12 12:03:23.105016] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1574f00 name raid_bdev1, state offline 00:22:32.888 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.888 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:33.146 12:03:23 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:33.146 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:33.404 /dev/nbd0 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:33.404 12:03:23 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:33.404 1+0 records in 00:22:33.404 1+0 records out 00:22:33.404 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218138 s, 18.8 MB/s 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:33.404 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:33.663 /dev/nbd1 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:33.663 12:03:23 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:33.663 1+0 records in 00:22:33.663 1+0 records out 00:22:33.663 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235978 s, 17.4 MB/s 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:33.663 12:03:23 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:33.663 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:33.921 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:33.921 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:33.921 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:33.921 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:33.921 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:33.921 12:03:23 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:33.921 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:22:33.921 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:22:33.921 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:33.921 12:03:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:33.921 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:34.180 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:34.438 [2024-07-12 12:03:24.448767] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:34.438 [2024-07-12 12:03:24.448798] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:34.438 [2024-07-12 12:03:24.448809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x142eba0 00:22:34.438 [2024-07-12 12:03:24.448815] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:34.438 [2024-07-12 12:03:24.449922] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:34.438 [2024-07-12 12:03:24.449941] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:34.439 [2024-07-12 12:03:24.449978] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:34.439 [2024-07-12 12:03:24.449995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:34.439 [2024-07-12 12:03:24.450058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:34.439 spare 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.439 [2024-07-12 12:03:24.550346] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1577c20 00:22:34.439 [2024-07-12 12:03:24.550356] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:34.439 [2024-07-12 12:03:24.550401] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1574840 00:22:34.439 [2024-07-12 12:03:24.550482] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1577c20 00:22:34.439 [2024-07-12 12:03:24.550487] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1577c20 00:22:34.439 [2024-07-12 12:03:24.550539] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.439 "name": "raid_bdev1", 00:22:34.439 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:34.439 "strip_size_kb": 0, 00:22:34.439 "state": "online", 00:22:34.439 "raid_level": "raid1", 00:22:34.439 "superblock": true, 00:22:34.439 "num_base_bdevs": 2, 00:22:34.439 
"num_base_bdevs_discovered": 2, 00:22:34.439 "num_base_bdevs_operational": 2, 00:22:34.439 "base_bdevs_list": [ 00:22:34.439 { 00:22:34.439 "name": "spare", 00:22:34.439 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:34.439 "is_configured": true, 00:22:34.439 "data_offset": 256, 00:22:34.439 "data_size": 7936 00:22:34.439 }, 00:22:34.439 { 00:22:34.439 "name": "BaseBdev2", 00:22:34.439 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:34.439 "is_configured": true, 00:22:34.439 "data_offset": 256, 00:22:34.439 "data_size": 7936 00:22:34.439 } 00:22:34.439 ] 00:22:34.439 }' 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.439 12:03:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:35.005 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:35.005 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:35.005 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:35.005 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:35.005 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:35.005 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.005 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.264 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:35.264 "name": "raid_bdev1", 00:22:35.264 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:35.264 
"strip_size_kb": 0, 00:22:35.264 "state": "online", 00:22:35.264 "raid_level": "raid1", 00:22:35.264 "superblock": true, 00:22:35.264 "num_base_bdevs": 2, 00:22:35.264 "num_base_bdevs_discovered": 2, 00:22:35.264 "num_base_bdevs_operational": 2, 00:22:35.264 "base_bdevs_list": [ 00:22:35.264 { 00:22:35.264 "name": "spare", 00:22:35.264 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:35.264 "is_configured": true, 00:22:35.264 "data_offset": 256, 00:22:35.264 "data_size": 7936 00:22:35.264 }, 00:22:35.264 { 00:22:35.264 "name": "BaseBdev2", 00:22:35.264 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:35.264 "is_configured": true, 00:22:35.264 "data_offset": 256, 00:22:35.264 "data_size": 7936 00:22:35.264 } 00:22:35.264 ] 00:22:35.264 }' 00:22:35.264 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:35.264 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:35.264 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:35.264 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:35.264 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.264 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:35.523 [2024-07-12 12:03:25.708116] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.523 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.782 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.782 "name": "raid_bdev1", 00:22:35.782 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:35.782 "strip_size_kb": 0, 00:22:35.782 "state": "online", 00:22:35.782 "raid_level": "raid1", 00:22:35.782 "superblock": true, 00:22:35.782 
"num_base_bdevs": 2, 00:22:35.782 "num_base_bdevs_discovered": 1, 00:22:35.782 "num_base_bdevs_operational": 1, 00:22:35.782 "base_bdevs_list": [ 00:22:35.782 { 00:22:35.782 "name": null, 00:22:35.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.782 "is_configured": false, 00:22:35.782 "data_offset": 256, 00:22:35.782 "data_size": 7936 00:22:35.782 }, 00:22:35.782 { 00:22:35.782 "name": "BaseBdev2", 00:22:35.782 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:35.782 "is_configured": true, 00:22:35.782 "data_offset": 256, 00:22:35.782 "data_size": 7936 00:22:35.782 } 00:22:35.782 ] 00:22:35.782 }' 00:22:35.782 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.782 12:03:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:36.348 12:03:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:36.349 [2024-07-12 12:03:26.514209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:36.349 [2024-07-12 12:03:26.514325] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:36.349 [2024-07-12 12:03:26.514335] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:22:36.349 [2024-07-12 12:03:26.514354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:36.349 [2024-07-12 12:03:26.516232] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1576680 00:22:36.349 [2024-07-12 12:03:26.517650] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:36.349 12:03:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:37.727 "name": "raid_bdev1", 00:22:37.727 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:37.727 "strip_size_kb": 0, 00:22:37.727 "state": "online", 00:22:37.727 "raid_level": "raid1", 00:22:37.727 "superblock": true, 00:22:37.727 "num_base_bdevs": 2, 00:22:37.727 "num_base_bdevs_discovered": 2, 00:22:37.727 "num_base_bdevs_operational": 2, 00:22:37.727 "process": { 00:22:37.727 "type": "rebuild", 00:22:37.727 
"target": "spare", 00:22:37.727 "progress": { 00:22:37.727 "blocks": 2816, 00:22:37.727 "percent": 35 00:22:37.727 } 00:22:37.727 }, 00:22:37.727 "base_bdevs_list": [ 00:22:37.727 { 00:22:37.727 "name": "spare", 00:22:37.727 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:37.727 "is_configured": true, 00:22:37.727 "data_offset": 256, 00:22:37.727 "data_size": 7936 00:22:37.727 }, 00:22:37.727 { 00:22:37.727 "name": "BaseBdev2", 00:22:37.727 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:37.727 "is_configured": true, 00:22:37.727 "data_offset": 256, 00:22:37.727 "data_size": 7936 00:22:37.727 } 00:22:37.727 ] 00:22:37.727 }' 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:37.727 12:03:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:37.727 [2024-07-12 12:03:27.950800] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:37.986 [2024-07-12 12:03:28.028230] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:37.986 [2024-07-12 12:03:28.028260] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:37.986 [2024-07-12 12:03:28.028268] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:37.986 [2024-07-12 12:03:28.028272] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.986 "name": "raid_bdev1", 00:22:37.986 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:37.986 "strip_size_kb": 0, 00:22:37.986 "state": "online", 00:22:37.986 "raid_level": "raid1", 00:22:37.986 "superblock": true, 00:22:37.986 "num_base_bdevs": 2, 00:22:37.986 "num_base_bdevs_discovered": 1, 
00:22:37.986 "num_base_bdevs_operational": 1, 00:22:37.986 "base_bdevs_list": [ 00:22:37.986 { 00:22:37.986 "name": null, 00:22:37.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.986 "is_configured": false, 00:22:37.986 "data_offset": 256, 00:22:37.986 "data_size": 7936 00:22:37.986 }, 00:22:37.986 { 00:22:37.986 "name": "BaseBdev2", 00:22:37.986 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:37.986 "is_configured": true, 00:22:37.986 "data_offset": 256, 00:22:37.986 "data_size": 7936 00:22:37.986 } 00:22:37.986 ] 00:22:37.986 }' 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.986 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:38.554 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:38.814 [2024-07-12 12:03:28.849144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:38.814 [2024-07-12 12:03:28.849177] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:38.814 [2024-07-12 12:03:28.849207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15786a0 00:22:38.814 [2024-07-12 12:03:28.849213] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:38.814 [2024-07-12 12:03:28.849371] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:38.814 [2024-07-12 12:03:28.849380] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:38.814 [2024-07-12 12:03:28.849418] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:38.814 [2024-07-12 12:03:28.849424] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:22:38.814 [2024-07-12 12:03:28.849429] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:38.814 [2024-07-12 12:03:28.849440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:38.814 [2024-07-12 12:03:28.851310] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1578930 00:22:38.814 [2024-07-12 12:03:28.852344] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:38.814 spare 00:22:38.814 12:03:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:39.752 12:03:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:39.752 12:03:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:39.752 12:03:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:39.752 12:03:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:39.752 12:03:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:39.752 12:03:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.752 12:03:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.011 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:40.011 "name": "raid_bdev1", 00:22:40.011 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:40.011 "strip_size_kb": 0, 00:22:40.011 "state": "online", 00:22:40.011 "raid_level": "raid1", 00:22:40.011 "superblock": 
true, 00:22:40.011 "num_base_bdevs": 2, 00:22:40.011 "num_base_bdevs_discovered": 2, 00:22:40.011 "num_base_bdevs_operational": 2, 00:22:40.011 "process": { 00:22:40.011 "type": "rebuild", 00:22:40.011 "target": "spare", 00:22:40.011 "progress": { 00:22:40.011 "blocks": 2816, 00:22:40.011 "percent": 35 00:22:40.011 } 00:22:40.011 }, 00:22:40.011 "base_bdevs_list": [ 00:22:40.011 { 00:22:40.011 "name": "spare", 00:22:40.011 "uuid": "6d64484e-7d61-5487-b9e5-3391aa5c669c", 00:22:40.011 "is_configured": true, 00:22:40.011 "data_offset": 256, 00:22:40.011 "data_size": 7936 00:22:40.011 }, 00:22:40.011 { 00:22:40.011 "name": "BaseBdev2", 00:22:40.011 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:40.011 "is_configured": true, 00:22:40.011 "data_offset": 256, 00:22:40.011 "data_size": 7936 00:22:40.011 } 00:22:40.011 ] 00:22:40.011 }' 00:22:40.011 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:40.011 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:40.011 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:40.012 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:40.012 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:40.271 [2024-07-12 12:03:30.277531] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:40.271 [2024-07-12 12:03:30.362997] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:40.271 [2024-07-12 12:03:30.363021] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:40.271 [2024-07-12 12:03:30.363029] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:40.271 [2024-07-12 12:03:30.363033] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.271 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.530 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.530 "name": "raid_bdev1", 00:22:40.530 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 
00:22:40.530 "strip_size_kb": 0, 00:22:40.530 "state": "online", 00:22:40.530 "raid_level": "raid1", 00:22:40.530 "superblock": true, 00:22:40.530 "num_base_bdevs": 2, 00:22:40.530 "num_base_bdevs_discovered": 1, 00:22:40.530 "num_base_bdevs_operational": 1, 00:22:40.530 "base_bdevs_list": [ 00:22:40.530 { 00:22:40.530 "name": null, 00:22:40.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.530 "is_configured": false, 00:22:40.530 "data_offset": 256, 00:22:40.530 "data_size": 7936 00:22:40.530 }, 00:22:40.530 { 00:22:40.530 "name": "BaseBdev2", 00:22:40.530 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:40.530 "is_configured": true, 00:22:40.530 "data_offset": 256, 00:22:40.530 "data_size": 7936 00:22:40.530 } 00:22:40.530 ] 00:22:40.530 }' 00:22:40.530 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.530 12:03:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.097 12:03:31 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:41.097 "name": "raid_bdev1", 00:22:41.097 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:41.097 "strip_size_kb": 0, 00:22:41.097 "state": "online", 00:22:41.097 "raid_level": "raid1", 00:22:41.097 "superblock": true, 00:22:41.097 "num_base_bdevs": 2, 00:22:41.097 "num_base_bdevs_discovered": 1, 00:22:41.097 "num_base_bdevs_operational": 1, 00:22:41.097 "base_bdevs_list": [ 00:22:41.097 { 00:22:41.097 "name": null, 00:22:41.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.097 "is_configured": false, 00:22:41.097 "data_offset": 256, 00:22:41.097 "data_size": 7936 00:22:41.097 }, 00:22:41.097 { 00:22:41.097 "name": "BaseBdev2", 00:22:41.097 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:41.097 "is_configured": true, 00:22:41.097 "data_offset": 256, 00:22:41.097 "data_size": 7936 00:22:41.097 } 00:22:41.097 ] 00:22:41.097 }' 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:41.097 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:41.356 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:41.356 [2024-07-12 12:03:31.593035] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:22:41.356 [2024-07-12 12:03:31.593064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:41.356 [2024-07-12 12:03:31.593075] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1430c40 00:22:41.356 [2024-07-12 12:03:31.593080] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:41.356 [2024-07-12 12:03:31.593239] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:41.356 [2024-07-12 12:03:31.593248] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:41.356 [2024-07-12 12:03:31.593276] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:41.356 [2024-07-12 12:03:31.593283] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:41.356 [2024-07-12 12:03:31.593287] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:41.356 BaseBdev1 00:22:41.613 12:03:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:42.547 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:42.547 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:42.547 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:42.547 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.547 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.548 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:42.548 
12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.548 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.548 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.548 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.548 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.548 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.548 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.548 "name": "raid_bdev1", 00:22:42.548 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:42.548 "strip_size_kb": 0, 00:22:42.548 "state": "online", 00:22:42.548 "raid_level": "raid1", 00:22:42.548 "superblock": true, 00:22:42.548 "num_base_bdevs": 2, 00:22:42.548 "num_base_bdevs_discovered": 1, 00:22:42.548 "num_base_bdevs_operational": 1, 00:22:42.548 "base_bdevs_list": [ 00:22:42.548 { 00:22:42.548 "name": null, 00:22:42.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.548 "is_configured": false, 00:22:42.548 "data_offset": 256, 00:22:42.548 "data_size": 7936 00:22:42.548 }, 00:22:42.548 { 00:22:42.548 "name": "BaseBdev2", 00:22:42.548 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:42.548 "is_configured": true, 00:22:42.548 "data_offset": 256, 00:22:42.548 "data_size": 7936 00:22:42.548 } 00:22:42.548 ] 00:22:42.548 }' 00:22:42.548 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.548 12:03:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:22:43.113 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:43.113 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:43.113 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:43.113 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:43.113 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:43.113 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.113 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:43.372 "name": "raid_bdev1", 00:22:43.372 "uuid": "99e124e6-28da-4329-b673-065e36d223fb", 00:22:43.372 "strip_size_kb": 0, 00:22:43.372 "state": "online", 00:22:43.372 "raid_level": "raid1", 00:22:43.372 "superblock": true, 00:22:43.372 "num_base_bdevs": 2, 00:22:43.372 "num_base_bdevs_discovered": 1, 00:22:43.372 "num_base_bdevs_operational": 1, 00:22:43.372 "base_bdevs_list": [ 00:22:43.372 { 00:22:43.372 "name": null, 00:22:43.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:43.372 "is_configured": false, 00:22:43.372 "data_offset": 256, 00:22:43.372 "data_size": 7936 00:22:43.372 }, 00:22:43.372 { 00:22:43.372 "name": "BaseBdev2", 00:22:43.372 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f", 00:22:43.372 "is_configured": true, 00:22:43.372 "data_offset": 256, 00:22:43.372 "data_size": 7936 00:22:43.372 } 00:22:43.372 ] 00:22:43.372 }' 00:22:43.372 12:03:33 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:43.372 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:43.630 [2024-07-12 12:03:33.698495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:43.630 [2024-07-12 12:03:33.698596] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:43.630 [2024-07-12 12:03:33.698605] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:43.630 request: 00:22:43.630 { 00:22:43.630 "raid_bdev": "raid_bdev1", 00:22:43.630 "base_bdev": "BaseBdev1", 00:22:43.630 "method": "bdev_raid_add_base_bdev", 00:22:43.630 "req_id": 1 00:22:43.630 } 00:22:43.630 Got JSON-RPC error response 00:22:43.630 response: 00:22:43.630 { 00:22:43.630 "code": -22, 00:22:43.630 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:43.630 } 00:22:43.630 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:22:43.630 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:43.630 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:43.630 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:43.630 12:03:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1
00:22:44.564 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:22:44.564 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:22:44.564 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:22:44.565 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:44.565 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:44.565 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:22:44.565 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:44.565 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:44.565 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:44.565 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:44.565 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:44.565 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:22:44.823 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:44.823 "name": "raid_bdev1",
00:22:44.823 "uuid": "99e124e6-28da-4329-b673-065e36d223fb",
00:22:44.823 "strip_size_kb": 0,
00:22:44.823 "state": "online",
00:22:44.823 "raid_level": "raid1",
00:22:44.823 "superblock": true,
00:22:44.823 "num_base_bdevs": 2,
00:22:44.823 "num_base_bdevs_discovered": 1,
00:22:44.823 "num_base_bdevs_operational": 1,
00:22:44.823 "base_bdevs_list": [
00:22:44.824 {
00:22:44.824 "name": null,
00:22:44.824 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:44.824 "is_configured": false,
00:22:44.824 "data_offset": 256,
00:22:44.824 "data_size": 7936
00:22:44.824 },
00:22:44.824 {
00:22:44.824 "name": "BaseBdev2",
00:22:44.824 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f",
00:22:44.824 "is_configured": true,
00:22:44.824 "data_offset": 256,
00:22:44.824 "data_size": 7936
00:22:44.824 }
00:22:44.824 ]
00:22:44.824 }'
00:22:44.824 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:44.824 12:03:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:22:45.392 "name": "raid_bdev1",
00:22:45.392 "uuid": "99e124e6-28da-4329-b673-065e36d223fb",
00:22:45.392 "strip_size_kb": 0,
00:22:45.392 "state": "online",
00:22:45.392 "raid_level": "raid1",
00:22:45.392 "superblock": true,
00:22:45.392 "num_base_bdevs": 2,
00:22:45.392 "num_base_bdevs_discovered": 1,
00:22:45.392 "num_base_bdevs_operational": 1,
00:22:45.392 "base_bdevs_list": [
00:22:45.392 {
00:22:45.392 "name": null,
00:22:45.392 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:45.392 "is_configured": false,
00:22:45.392 "data_offset": 256,
00:22:45.392 "data_size": 7936
00:22:45.392 },
00:22:45.392 {
00:22:45.392 "name": "BaseBdev2",
00:22:45.392 "uuid": "ed08f4d2-0aa5-5b2b-b718-d426ac0ecc5f",
00:22:45.392 "is_configured": true,
00:22:45.392 "data_offset": 256,
00:22:45.392 "data_size": 7936
00:22:45.392 }
00:22:45.392 ]
00:22:45.392 }'
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:22:45.392 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:22:45.651 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:22:45.651 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 737802
00:22:45.651 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 737802 ']'
00:22:45.651 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 737802
00:22:45.651 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 737802
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 737802'
00:22:45.652 killing process with pid 737802
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 737802
00:22:45.652 Received shutdown signal, test time was about 60.000000 seconds
00:22:45.652
00:22:45.652 Latency(us)
00:22:45.652 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:22:45.652 ===================================================================================================================
00:22:45.652 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00
00:22:45.652 [2024-07-12 12:03:35.697371] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:22:45.652 [2024-07-12 12:03:35.697440] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:22:45.652 [2024-07-12 12:03:35.697470] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:22:45.652 [2024-07-12 12:03:35.697476] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1577c20 name raid_bdev1, state offline
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 737802
00:22:45.652 [2024-07-12 12:03:35.724324] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0
00:22:45.652
00:22:45.652 real 0m25.449s
00:22:45.652 user 0m39.031s
00:22:45.652 sys 0m3.256s
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable
00:22:45.652 12:03:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x
00:22:45.652 ************************************
00:22:45.652 END TEST raid_rebuild_test_sb_md_separate
00:22:45.652 ************************************
00:22:45.911 12:03:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:22:45.911 12:03:35 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i'
00:22:45.911 12:03:35 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true
00:22:45.911 12:03:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:22:45.911 12:03:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:22:45.911 12:03:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:22:45.911 ************************************
00:22:45.911 START TEST raid_state_function_test_sb_md_interleaved
00:22:45.911 ************************************
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']'
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']'
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=742364
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 742364'
00:22:45.911 Process raid pid: 742364
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 742364 /var/tmp/spdk-raid.sock
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 742364 ']'
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable
00:22:45.911 12:03:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:22:45.911 [2024-07-12 12:03:36.024492] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization...
00:22:45.911 [2024-07-12 12:03:36.024557] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:22:45.911 [2024-07-12 12:03:36.088279] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:22:46.171 [2024-07-12 12:03:36.166104] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:22:46.171 [2024-07-12 12:03:36.223837] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:22:46.171 [2024-07-12 12:03:36.223861] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:22:46.739 [2024-07-12 12:03:36.967215] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:22:46.739 [2024-07-12 12:03:36.967245] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:22:46.739 [2024-07-12 12:03:36.967251] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:22:46.739 [2024-07-12 12:03:36.967256] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:46.739 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:46.740 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:46.740 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:46.999 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:46.999 12:03:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:22:46.999 12:03:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:46.999 "name": "Existed_Raid",
00:22:46.999 "uuid": "d18548da-9ea3-43ea-9de5-0d073c77ed32",
00:22:46.999 "strip_size_kb": 0,
00:22:46.999 "state": "configuring",
00:22:46.999 "raid_level": "raid1",
00:22:46.999 "superblock": true,
00:22:46.999 "num_base_bdevs": 2,
00:22:46.999 "num_base_bdevs_discovered": 0,
00:22:46.999 "num_base_bdevs_operational": 2,
00:22:46.999 "base_bdevs_list": [
00:22:46.999 {
00:22:46.999 "name": "BaseBdev1",
00:22:46.999 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:46.999 "is_configured": false,
00:22:46.999 "data_offset": 0,
00:22:46.999 "data_size": 0
00:22:46.999 },
00:22:46.999 {
00:22:46.999 "name": "BaseBdev2",
00:22:46.999 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:46.999 "is_configured": false,
00:22:46.999 "data_offset": 0,
00:22:46.999 "data_size": 0
00:22:46.999 }
00:22:46.999 ]
00:22:46.999 }'
00:22:46.999 12:03:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:46.999 12:03:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:22:47.596 12:03:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:22:47.596 [2024-07-12 12:03:37.785254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:22:47.596 [2024-07-12 12:03:37.785279] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd41b0 name Existed_Raid, state configuring
00:22:47.596 12:03:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:22:47.881 [2024-07-12 12:03:37.961717] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:22:47.881 [2024-07-12 12:03:37.961737] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:22:47.881 [2024-07-12 12:03:37.961742] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:22:47.881 [2024-07-12 12:03:37.961747] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:22:47.881 12:03:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1
00:22:48.140 [2024-07-12 12:03:38.138419] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:22:48.140 BaseBdev1
00:22:48.140 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:22:48.140 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:22:48.140 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:22:48.140 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i
00:22:48.140 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:22:48.140 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:22:48.140 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:22:48.140 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:22:48.399 [
00:22:48.399 {
00:22:48.399 "name": "BaseBdev1",
00:22:48.399 "aliases": [
00:22:48.399 "1084a663-fd4a-4b8c-b569-6290a64a0164"
00:22:48.399 ],
00:22:48.399 "product_name": "Malloc disk",
00:22:48.399 "block_size": 4128,
00:22:48.399 "num_blocks": 8192,
00:22:48.399 "uuid": "1084a663-fd4a-4b8c-b569-6290a64a0164",
00:22:48.399 "md_size": 32,
00:22:48.399 "md_interleave": true,
00:22:48.399 "dif_type": 0,
00:22:48.399 "assigned_rate_limits": {
00:22:48.399 "rw_ios_per_sec": 0,
00:22:48.399 "rw_mbytes_per_sec": 0,
00:22:48.399 "r_mbytes_per_sec": 0,
00:22:48.399 "w_mbytes_per_sec": 0
00:22:48.399 },
00:22:48.399 "claimed": true,
00:22:48.399 "claim_type": "exclusive_write",
00:22:48.399 "zoned": false,
00:22:48.399 "supported_io_types": {
00:22:48.399 "read": true,
00:22:48.399 "write": true,
00:22:48.399 "unmap": true,
00:22:48.399 "flush": true,
00:22:48.399 "reset": true,
00:22:48.399 "nvme_admin": false,
00:22:48.399 "nvme_io": false,
00:22:48.399 "nvme_io_md": false,
00:22:48.399 "write_zeroes": true,
00:22:48.399 "zcopy": true,
00:22:48.399 "get_zone_info": false,
00:22:48.399 "zone_management": false,
00:22:48.399 "zone_append": false,
00:22:48.399 "compare": false,
00:22:48.399 "compare_and_write": false,
00:22:48.399 "abort": true,
00:22:48.399 "seek_hole": false,
00:22:48.399 "seek_data": false,
00:22:48.399 "copy": true,
00:22:48.399 "nvme_iov_md": false
00:22:48.399 },
00:22:48.399 "memory_domains": [
00:22:48.399 {
00:22:48.399 "dma_device_id": "system",
00:22:48.399 "dma_device_type": 1
00:22:48.399 },
00:22:48.399 {
00:22:48.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:22:48.399 "dma_device_type": 2
00:22:48.399 }
00:22:48.399 ],
00:22:48.399 "driver_specific": {}
00:22:48.399 }
00:22:48.399 ]
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:48.399 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:22:48.658 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:48.658 "name": "Existed_Raid",
00:22:48.658 "uuid": "bedcae99-1bb3-4c84-bae7-98b87fbab8dd",
00:22:48.658 "strip_size_kb": 0,
00:22:48.658 "state": "configuring",
00:22:48.658 "raid_level": "raid1",
00:22:48.658 "superblock": true,
00:22:48.658 "num_base_bdevs": 2,
00:22:48.658 "num_base_bdevs_discovered": 1,
00:22:48.658 "num_base_bdevs_operational": 2,
00:22:48.658 "base_bdevs_list": [
00:22:48.658 {
00:22:48.658 "name": "BaseBdev1",
00:22:48.658 "uuid": "1084a663-fd4a-4b8c-b569-6290a64a0164",
00:22:48.658 "is_configured": true,
00:22:48.658 "data_offset": 256,
00:22:48.659 "data_size": 7936
00:22:48.659 },
00:22:48.659 {
00:22:48.659 "name": "BaseBdev2",
00:22:48.659 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:48.659 "is_configured": false,
00:22:48.659 "data_offset": 0,
00:22:48.659 "data_size": 0
00:22:48.659 }
00:22:48.659 ]
00:22:48.659 }'
00:22:48.659 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:48.659 12:03:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:22:48.917 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:22:49.176 [2024-07-12 12:03:39.301462] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:22:49.176 [2024-07-12 12:03:39.301492] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd3aa0 name Existed_Raid, state configuring
00:22:49.176 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:22:49.435 [2024-07-12 12:03:39.469925] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:22:49.435 [2024-07-12 12:03:39.470962] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:22:49.435 [2024-07-12 12:03:39.470984] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:49.435 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:49.436 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:49.436 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:49.436 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:49.436 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:22:49.436 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:49.436 "name": "Existed_Raid",
00:22:49.436 "uuid": "c1958658-a04b-4e8b-8a7e-0f7717fd9541",
00:22:49.436 "strip_size_kb": 0,
00:22:49.436 "state": "configuring",
00:22:49.436 "raid_level": "raid1",
00:22:49.436 "superblock": true,
00:22:49.436 "num_base_bdevs": 2,
00:22:49.436 "num_base_bdevs_discovered": 1,
00:22:49.436 "num_base_bdevs_operational": 2,
00:22:49.436 "base_bdevs_list": [
00:22:49.436 {
00:22:49.436 "name": "BaseBdev1",
00:22:49.436 "uuid": "1084a663-fd4a-4b8c-b569-6290a64a0164",
00:22:49.436 "is_configured": true,
00:22:49.436 "data_offset": 256,
00:22:49.436 "data_size": 7936
00:22:49.436 },
00:22:49.436 {
00:22:49.436 "name": "BaseBdev2",
00:22:49.436 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:49.436 "is_configured": false,
00:22:49.436 "data_offset": 0,
00:22:49.436 "data_size": 0
00:22:49.436 }
00:22:49.436 ]
00:22:49.436 }'
00:22:49.436 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:49.436 12:03:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:22:50.003 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2
00:22:50.262 [2024-07-12 12:03:40.319078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:22:50.262 [2024-07-12 12:03:40.319177] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfd32b0
00:22:50.262 [2024-07-12 12:03:40.319185] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128
00:22:50.262 [2024-07-12 12:03:40.319226] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfd59f0
00:22:50.262 [2024-07-12 12:03:40.319275] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfd32b0
00:22:50.262 [2024-07-12 12:03:40.319280] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfd32b0
00:22:50.262 [2024-07-12 12:03:40.319316] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:22:50.262 BaseBdev2
00:22:50.262 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:22:50.262 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:22:50.262 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:22:50.262 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i
00:22:50.262 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:22:50.262 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:22:50.262 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:22:50.520 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:22:50.520 [
00:22:50.520 {
00:22:50.520 "name": "BaseBdev2",
00:22:50.520 "aliases": [
00:22:50.520 "2f2fa329-a124-41c0-8318-442618513de6"
00:22:50.520 ],
00:22:50.521 "product_name": "Malloc disk",
00:22:50.521 "block_size": 4128,
00:22:50.521 "num_blocks": 8192,
00:22:50.521 "uuid": "2f2fa329-a124-41c0-8318-442618513de6",
00:22:50.521 "md_size": 32,
00:22:50.521 "md_interleave": true,
00:22:50.521 "dif_type": 0,
00:22:50.521 "assigned_rate_limits": {
00:22:50.521 "rw_ios_per_sec": 0,
00:22:50.521 "rw_mbytes_per_sec": 0,
00:22:50.521 "r_mbytes_per_sec": 0,
00:22:50.521 "w_mbytes_per_sec": 0
00:22:50.521 },
00:22:50.521 "claimed": true,
00:22:50.521 "claim_type": "exclusive_write",
00:22:50.521 "zoned": false,
00:22:50.521 "supported_io_types": {
00:22:50.521 "read": true,
00:22:50.521 "write": true,
00:22:50.521 "unmap": true,
00:22:50.521 "flush": true,
00:22:50.521 "reset": true,
00:22:50.521 "nvme_admin": false,
00:22:50.521 "nvme_io": false,
00:22:50.521 "nvme_io_md": false,
00:22:50.521 "write_zeroes": true,
00:22:50.521 "zcopy": true,
00:22:50.521 "get_zone_info": false,
00:22:50.521 "zone_management": false,
00:22:50.521 "zone_append": false,
00:22:50.521 "compare": false,
00:22:50.521 "compare_and_write": false,
00:22:50.521 "abort": true,
00:22:50.521 "seek_hole": false,
00:22:50.521 "seek_data": false,
00:22:50.521 "copy": true,
00:22:50.521 "nvme_iov_md": false
00:22:50.521 },
00:22:50.521 "memory_domains": [
00:22:50.521 {
00:22:50.521 "dma_device_id": "system",
00:22:50.521 "dma_device_type": 1
00:22:50.521 },
00:22:50.521 {
00:22:50.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:22:50.521 "dma_device_type": 2
00:22:50.521 }
00:22:50.521 ],
00:22:50.521 "driver_specific": {}
00:22:50.521 }
00:22:50.521 ]
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:50.521 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:22:50.779 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:50.779 "name": "Existed_Raid",
00:22:50.779 "uuid": "c1958658-a04b-4e8b-8a7e-0f7717fd9541",
00:22:50.779 "strip_size_kb": 0,
00:22:50.779 "state": "online",
00:22:50.779 "raid_level": "raid1",
00:22:50.779 "superblock": true,
00:22:50.779 "num_base_bdevs": 2,
00:22:50.779 "num_base_bdevs_discovered": 2,
00:22:50.779 "num_base_bdevs_operational": 2,
00:22:50.779 "base_bdevs_list": [
00:22:50.779 {
00:22:50.779 "name": "BaseBdev1",
00:22:50.779 "uuid": "1084a663-fd4a-4b8c-b569-6290a64a0164",
00:22:50.779 "is_configured": true,
00:22:50.779 "data_offset": 256,
00:22:50.779 "data_size": 7936
00:22:50.779 },
00:22:50.779 {
00:22:50.779 "name": "BaseBdev2",
00:22:50.779 "uuid": "2f2fa329-a124-41c0-8318-442618513de6",
00:22:50.779 "is_configured": true,
00:22:50.779 "data_offset": 256,
00:22:50.779 "data_size": 7936
00:22:50.779 }
00:22:50.779 ]
00:22:50.779 }'
00:22:50.779 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:50.779 12:03:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid
00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid
00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name
00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:22:51.346 [2024-07-12 12:03:41.502470] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:22:51.346 "name": "Existed_Raid",
00:22:51.346 "aliases": [
00:22:51.346 "c1958658-a04b-4e8b-8a7e-0f7717fd9541"
00:22:51.346 ],
00:22:51.346 "product_name": "Raid Volume",
00:22:51.346 "block_size": 4128,
00:22:51.346 "num_blocks": 7936,
00:22:51.346 "uuid": "c1958658-a04b-4e8b-8a7e-0f7717fd9541",
00:22:51.346 "md_size": 32,
00:22:51.346 "md_interleave": true,
00:22:51.346 "dif_type": 0,
00:22:51.346 "assigned_rate_limits": {
00:22:51.346 "rw_ios_per_sec": 0,
00:22:51.346 "rw_mbytes_per_sec": 0,
00:22:51.346 "r_mbytes_per_sec": 0,
00:22:51.346 "w_mbytes_per_sec": 0
00:22:51.346 },
00:22:51.346 "claimed": false,
00:22:51.346 "zoned": false,
00:22:51.346 "supported_io_types": {
00:22:51.346 "read": true,
00:22:51.346 "write": true,
00:22:51.346 "unmap": false,
00:22:51.346 "flush": false,
00:22:51.346 "reset": true,
00:22:51.346 "nvme_admin": false,
00:22:51.346 "nvme_io": false,
00:22:51.346 "nvme_io_md": false,
00:22:51.346 "write_zeroes": true,
00:22:51.346 "zcopy": false,
00:22:51.346 "get_zone_info": false,
00:22:51.346 "zone_management": false,
00:22:51.346 "zone_append": false,
00:22:51.346 "compare": false,
00:22:51.346 "compare_and_write": false,
00:22:51.346 "abort": false,
00:22:51.346 "seek_hole": false,
00:22:51.346 "seek_data": false,
00:22:51.346 "copy": false,
00:22:51.346 "nvme_iov_md": false
00:22:51.346 },
00:22:51.346 "memory_domains": [
00:22:51.346 {
00:22:51.346 "dma_device_id": "system",
00:22:51.346 "dma_device_type": 1
00:22:51.346 },
00:22:51.346 {
00:22:51.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:22:51.346 "dma_device_type": 2
00:22:51.346 },
00:22:51.346 {
00:22:51.346 "dma_device_id": "system",
00:22:51.346 "dma_device_type": 1
00:22:51.346 },
00:22:51.346 {
00:22:51.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:22:51.346 "dma_device_type": 2
00:22:51.346 }
00:22:51.346 ],
00:22:51.346 "driver_specific": {
00:22:51.346 "raid": {
00:22:51.346 "uuid": "c1958658-a04b-4e8b-8a7e-0f7717fd9541",
00:22:51.346 "strip_size_kb": 0,
00:22:51.346 "state": "online",
00:22:51.346 "raid_level": "raid1",
00:22:51.346 "superblock": true,
00:22:51.346 "num_base_bdevs": 2,
00:22:51.346 "num_base_bdevs_discovered": 2,
00:22:51.346 "num_base_bdevs_operational": 2,
00:22:51.346 "base_bdevs_list": [
00:22:51.346 {
00:22:51.346 "name": "BaseBdev1",
00:22:51.346 "uuid":
"1084a663-fd4a-4b8c-b569-6290a64a0164", 00:22:51.346 "is_configured": true, 00:22:51.346 "data_offset": 256, 00:22:51.346 "data_size": 7936 00:22:51.346 }, 00:22:51.346 { 00:22:51.346 "name": "BaseBdev2", 00:22:51.346 "uuid": "2f2fa329-a124-41c0-8318-442618513de6", 00:22:51.346 "is_configured": true, 00:22:51.346 "data_offset": 256, 00:22:51.346 "data_size": 7936 00:22:51.346 } 00:22:51.346 ] 00:22:51.346 } 00:22:51.346 } 00:22:51.346 }' 00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:51.346 BaseBdev2' 00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:51.346 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:51.605 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:51.605 "name": "BaseBdev1", 00:22:51.605 "aliases": [ 00:22:51.605 "1084a663-fd4a-4b8c-b569-6290a64a0164" 00:22:51.605 ], 00:22:51.605 "product_name": "Malloc disk", 00:22:51.605 "block_size": 4128, 00:22:51.605 "num_blocks": 8192, 00:22:51.605 "uuid": "1084a663-fd4a-4b8c-b569-6290a64a0164", 00:22:51.605 "md_size": 32, 00:22:51.605 "md_interleave": true, 00:22:51.605 "dif_type": 0, 00:22:51.605 "assigned_rate_limits": { 00:22:51.605 "rw_ios_per_sec": 0, 00:22:51.605 "rw_mbytes_per_sec": 0, 00:22:51.605 "r_mbytes_per_sec": 0, 00:22:51.605 "w_mbytes_per_sec": 0 00:22:51.605 }, 00:22:51.605 "claimed": 
true, 00:22:51.605 "claim_type": "exclusive_write", 00:22:51.605 "zoned": false, 00:22:51.605 "supported_io_types": { 00:22:51.605 "read": true, 00:22:51.605 "write": true, 00:22:51.605 "unmap": true, 00:22:51.605 "flush": true, 00:22:51.605 "reset": true, 00:22:51.605 "nvme_admin": false, 00:22:51.605 "nvme_io": false, 00:22:51.605 "nvme_io_md": false, 00:22:51.605 "write_zeroes": true, 00:22:51.605 "zcopy": true, 00:22:51.605 "get_zone_info": false, 00:22:51.605 "zone_management": false, 00:22:51.605 "zone_append": false, 00:22:51.605 "compare": false, 00:22:51.605 "compare_and_write": false, 00:22:51.605 "abort": true, 00:22:51.605 "seek_hole": false, 00:22:51.605 "seek_data": false, 00:22:51.605 "copy": true, 00:22:51.605 "nvme_iov_md": false 00:22:51.605 }, 00:22:51.605 "memory_domains": [ 00:22:51.605 { 00:22:51.605 "dma_device_id": "system", 00:22:51.605 "dma_device_type": 1 00:22:51.605 }, 00:22:51.605 { 00:22:51.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.605 "dma_device_type": 2 00:22:51.605 } 00:22:51.605 ], 00:22:51.605 "driver_specific": {} 00:22:51.605 }' 00:22:51.605 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:51.605 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:51.605 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:51.605 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:51.865 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:51.865 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:51.865 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:51.865 12:03:41 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:51.865 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:51.865 12:03:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:51.865 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:51.865 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:51.865 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:51.865 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:51.865 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:52.124 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:52.124 "name": "BaseBdev2", 00:22:52.124 "aliases": [ 00:22:52.124 "2f2fa329-a124-41c0-8318-442618513de6" 00:22:52.124 ], 00:22:52.124 "product_name": "Malloc disk", 00:22:52.124 "block_size": 4128, 00:22:52.124 "num_blocks": 8192, 00:22:52.124 "uuid": "2f2fa329-a124-41c0-8318-442618513de6", 00:22:52.124 "md_size": 32, 00:22:52.124 "md_interleave": true, 00:22:52.124 "dif_type": 0, 00:22:52.124 "assigned_rate_limits": { 00:22:52.124 "rw_ios_per_sec": 0, 00:22:52.124 "rw_mbytes_per_sec": 0, 00:22:52.124 "r_mbytes_per_sec": 0, 00:22:52.124 "w_mbytes_per_sec": 0 00:22:52.124 }, 00:22:52.124 "claimed": true, 00:22:52.124 "claim_type": "exclusive_write", 00:22:52.124 "zoned": false, 00:22:52.124 "supported_io_types": { 00:22:52.124 "read": true, 00:22:52.124 "write": true, 00:22:52.124 "unmap": true, 00:22:52.124 
"flush": true, 00:22:52.124 "reset": true, 00:22:52.124 "nvme_admin": false, 00:22:52.124 "nvme_io": false, 00:22:52.124 "nvme_io_md": false, 00:22:52.124 "write_zeroes": true, 00:22:52.124 "zcopy": true, 00:22:52.124 "get_zone_info": false, 00:22:52.124 "zone_management": false, 00:22:52.124 "zone_append": false, 00:22:52.124 "compare": false, 00:22:52.124 "compare_and_write": false, 00:22:52.124 "abort": true, 00:22:52.124 "seek_hole": false, 00:22:52.124 "seek_data": false, 00:22:52.124 "copy": true, 00:22:52.124 "nvme_iov_md": false 00:22:52.124 }, 00:22:52.124 "memory_domains": [ 00:22:52.124 { 00:22:52.124 "dma_device_id": "system", 00:22:52.124 "dma_device_type": 1 00:22:52.124 }, 00:22:52.124 { 00:22:52.124 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:52.124 "dma_device_type": 2 00:22:52.124 } 00:22:52.124 ], 00:22:52.124 "driver_specific": {} 00:22:52.124 }' 00:22:52.124 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:52.124 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:52.124 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:52.124 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:52.124 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:52.124 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:52.124 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:52.383 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:52.383 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:52.383 12:03:42 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:52.383 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:52.383 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:52.383 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:52.643 [2024-07-12 12:03:42.637296] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:52.643 12:03:42 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.643 "name": "Existed_Raid", 00:22:52.643 "uuid": "c1958658-a04b-4e8b-8a7e-0f7717fd9541", 00:22:52.643 "strip_size_kb": 0, 00:22:52.643 "state": "online", 00:22:52.643 "raid_level": "raid1", 00:22:52.643 "superblock": true, 00:22:52.643 "num_base_bdevs": 2, 00:22:52.643 "num_base_bdevs_discovered": 1, 00:22:52.643 "num_base_bdevs_operational": 1, 00:22:52.643 "base_bdevs_list": [ 00:22:52.643 { 00:22:52.643 "name": null, 00:22:52.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.643 "is_configured": false, 00:22:52.643 "data_offset": 256, 00:22:52.643 "data_size": 7936 00:22:52.643 }, 00:22:52.643 { 00:22:52.643 "name": "BaseBdev2", 00:22:52.643 "uuid": "2f2fa329-a124-41c0-8318-442618513de6", 00:22:52.643 "is_configured": true, 00:22:52.643 "data_offset": 256, 00:22:52.643 "data_size": 7936 00:22:52.643 } 00:22:52.643 ] 00:22:52.643 }' 00:22:52.643 
12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.643 12:03:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:53.211 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:53.211 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:53.211 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.211 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:53.470 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:53.470 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:53.470 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:53.470 [2024-07-12 12:03:43.624809] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:53.470 [2024-07-12 12:03:43.624870] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:53.470 [2024-07-12 12:03:43.635241] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:53.470 [2024-07-12 12:03:43.635285] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:53.470 [2024-07-12 12:03:43.635291] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd32b0 name Existed_Raid, state offline 00:22:53.471 12:03:43 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:53.471 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:53.471 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.471 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 742364 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 742364 ']' 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 742364 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 742364 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 742364' 00:22:53.730 killing process with pid 742364 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 742364 00:22:53.730 [2024-07-12 12:03:43.863364] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:53.730 12:03:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 742364 00:22:53.730 [2024-07-12 12:03:43.864140] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:53.989 12:03:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:22:53.989 00:22:53.989 real 0m8.070s 00:22:53.989 user 0m14.477s 00:22:53.989 sys 0m1.307s 00:22:53.989 12:03:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:53.989 12:03:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:53.989 ************************************ 00:22:53.989 END TEST raid_state_function_test_sb_md_interleaved 00:22:53.989 ************************************ 00:22:53.989 12:03:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:53.989 12:03:44 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:22:53.989 12:03:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:53.989 12:03:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:53.989 12:03:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:53.989 ************************************ 00:22:53.989 START TEST raid_superblock_test_md_interleaved 00:22:53.989 ************************************ 00:22:53.989 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:22:53.989 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:53.989 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:22:53.989 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:53.989 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=743968 00:22:53.990 12:03:44 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 743968 /var/tmp/spdk-raid.sock 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 743968 ']' 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:53.990 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:53.990 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:53.990 [2024-07-12 12:03:44.158639] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:22:53.990 [2024-07-12 12:03:44.158678] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid743968 ] 00:22:53.990 [2024-07-12 12:03:44.221271] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:54.248 [2024-07-12 12:03:44.291852] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:54.248 [2024-07-12 12:03:44.342401] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:54.249 [2024-07-12 12:03:44.342430] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:54.817 12:03:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:54.817 12:03:44 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:22:55.077 malloc1 00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:55.077 [2024-07-12 12:03:45.258336] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:55.077 [2024-07-12 12:03:45.258372] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.077 [2024-07-12 12:03:45.258383] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x81faf0 00:22:55.077 [2024-07-12 12:03:45.258390] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.077 [2024-07-12 12:03:45.259406] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.077 [2024-07-12 12:03:45.259425] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:55.077 pt1 00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 
00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:55.077 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:22:55.336 malloc2 00:22:55.336 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:55.336 [2024-07-12 12:03:45.582738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:55.336 [2024-07-12 12:03:45.582764] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.336 [2024-07-12 12:03:45.582773] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9ad440 00:22:55.336 [2024-07-12 12:03:45.582779] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.595 [2024-07-12 12:03:45.583729] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.595 [2024-07-12 12:03:45.583748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:55.595 pt2 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 
00:22:55.595 [2024-07-12 12:03:45.747178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:55.595 [2024-07-12 12:03:45.748138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:55.595 [2024-07-12 12:03:45.748233] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9a0dd0 00:22:55.595 [2024-07-12 12:03:45.748241] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:55.595 [2024-07-12 12:03:45.748280] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x81db40 00:22:55.595 [2024-07-12 12:03:45.748333] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9a0dd0 00:22:55.595 [2024-07-12 12:03:45.748337] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9a0dd0 00:22:55.595 [2024-07-12 12:03:45.748370] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.595 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.854 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:55.854 "name": "raid_bdev1", 00:22:55.854 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:22:55.854 "strip_size_kb": 0, 00:22:55.854 "state": "online", 00:22:55.854 "raid_level": "raid1", 00:22:55.854 "superblock": true, 00:22:55.854 "num_base_bdevs": 2, 00:22:55.854 "num_base_bdevs_discovered": 2, 00:22:55.855 "num_base_bdevs_operational": 2, 00:22:55.855 "base_bdevs_list": [ 00:22:55.855 { 00:22:55.855 "name": "pt1", 00:22:55.855 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:55.855 "is_configured": true, 00:22:55.855 "data_offset": 256, 00:22:55.855 "data_size": 7936 00:22:55.855 }, 00:22:55.855 { 00:22:55.855 "name": "pt2", 00:22:55.855 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:55.855 "is_configured": true, 00:22:55.855 "data_offset": 256, 00:22:55.855 "data_size": 7936 00:22:55.855 } 00:22:55.855 ] 00:22:55.855 }' 00:22:55.855 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:55.855 12:03:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:56.423 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:56.423 12:03:46 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:56.423 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:56.423 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:56.423 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:56.423 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:22:56.423 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:56.423 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:56.423 [2024-07-12 12:03:46.597532] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:56.423 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:56.423 "name": "raid_bdev1", 00:22:56.423 "aliases": [ 00:22:56.423 "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c" 00:22:56.423 ], 00:22:56.423 "product_name": "Raid Volume", 00:22:56.423 "block_size": 4128, 00:22:56.423 "num_blocks": 7936, 00:22:56.423 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:22:56.423 "md_size": 32, 00:22:56.423 "md_interleave": true, 00:22:56.423 "dif_type": 0, 00:22:56.423 "assigned_rate_limits": { 00:22:56.423 "rw_ios_per_sec": 0, 00:22:56.423 "rw_mbytes_per_sec": 0, 00:22:56.423 "r_mbytes_per_sec": 0, 00:22:56.423 "w_mbytes_per_sec": 0 00:22:56.423 }, 00:22:56.423 "claimed": false, 00:22:56.423 "zoned": false, 00:22:56.423 "supported_io_types": { 00:22:56.423 "read": true, 00:22:56.423 "write": true, 00:22:56.423 "unmap": false, 00:22:56.423 "flush": false, 00:22:56.423 "reset": true, 00:22:56.423 "nvme_admin": false, 
00:22:56.423 "nvme_io": false, 00:22:56.423 "nvme_io_md": false, 00:22:56.423 "write_zeroes": true, 00:22:56.423 "zcopy": false, 00:22:56.423 "get_zone_info": false, 00:22:56.423 "zone_management": false, 00:22:56.423 "zone_append": false, 00:22:56.423 "compare": false, 00:22:56.423 "compare_and_write": false, 00:22:56.423 "abort": false, 00:22:56.423 "seek_hole": false, 00:22:56.423 "seek_data": false, 00:22:56.423 "copy": false, 00:22:56.423 "nvme_iov_md": false 00:22:56.423 }, 00:22:56.423 "memory_domains": [ 00:22:56.423 { 00:22:56.423 "dma_device_id": "system", 00:22:56.423 "dma_device_type": 1 00:22:56.423 }, 00:22:56.423 { 00:22:56.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:56.424 "dma_device_type": 2 00:22:56.424 }, 00:22:56.424 { 00:22:56.424 "dma_device_id": "system", 00:22:56.424 "dma_device_type": 1 00:22:56.424 }, 00:22:56.424 { 00:22:56.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:56.424 "dma_device_type": 2 00:22:56.424 } 00:22:56.424 ], 00:22:56.424 "driver_specific": { 00:22:56.424 "raid": { 00:22:56.424 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:22:56.424 "strip_size_kb": 0, 00:22:56.424 "state": "online", 00:22:56.424 "raid_level": "raid1", 00:22:56.424 "superblock": true, 00:22:56.424 "num_base_bdevs": 2, 00:22:56.424 "num_base_bdevs_discovered": 2, 00:22:56.424 "num_base_bdevs_operational": 2, 00:22:56.424 "base_bdevs_list": [ 00:22:56.424 { 00:22:56.424 "name": "pt1", 00:22:56.424 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:56.424 "is_configured": true, 00:22:56.424 "data_offset": 256, 00:22:56.424 "data_size": 7936 00:22:56.424 }, 00:22:56.424 { 00:22:56.424 "name": "pt2", 00:22:56.424 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:56.424 "is_configured": true, 00:22:56.424 "data_offset": 256, 00:22:56.424 "data_size": 7936 00:22:56.424 } 00:22:56.424 ] 00:22:56.424 } 00:22:56.424 } 00:22:56.424 }' 00:22:56.424 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:56.424 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:56.424 pt2' 00:22:56.424 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:56.424 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:56.424 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:56.685 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:56.685 "name": "pt1", 00:22:56.686 "aliases": [ 00:22:56.686 "00000000-0000-0000-0000-000000000001" 00:22:56.686 ], 00:22:56.686 "product_name": "passthru", 00:22:56.686 "block_size": 4128, 00:22:56.686 "num_blocks": 8192, 00:22:56.686 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:56.686 "md_size": 32, 00:22:56.686 "md_interleave": true, 00:22:56.686 "dif_type": 0, 00:22:56.686 "assigned_rate_limits": { 00:22:56.686 "rw_ios_per_sec": 0, 00:22:56.686 "rw_mbytes_per_sec": 0, 00:22:56.686 "r_mbytes_per_sec": 0, 00:22:56.686 "w_mbytes_per_sec": 0 00:22:56.686 }, 00:22:56.686 "claimed": true, 00:22:56.686 "claim_type": "exclusive_write", 00:22:56.686 "zoned": false, 00:22:56.686 "supported_io_types": { 00:22:56.686 "read": true, 00:22:56.686 "write": true, 00:22:56.686 "unmap": true, 00:22:56.686 "flush": true, 00:22:56.686 "reset": true, 00:22:56.686 "nvme_admin": false, 00:22:56.686 "nvme_io": false, 00:22:56.686 "nvme_io_md": false, 00:22:56.686 "write_zeroes": true, 00:22:56.686 "zcopy": true, 00:22:56.686 "get_zone_info": false, 00:22:56.686 "zone_management": false, 00:22:56.686 "zone_append": false, 00:22:56.686 "compare": false, 00:22:56.686 "compare_and_write": false, 00:22:56.686 
"abort": true, 00:22:56.686 "seek_hole": false, 00:22:56.686 "seek_data": false, 00:22:56.686 "copy": true, 00:22:56.686 "nvme_iov_md": false 00:22:56.686 }, 00:22:56.686 "memory_domains": [ 00:22:56.686 { 00:22:56.686 "dma_device_id": "system", 00:22:56.686 "dma_device_type": 1 00:22:56.686 }, 00:22:56.686 { 00:22:56.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:56.686 "dma_device_type": 2 00:22:56.686 } 00:22:56.686 ], 00:22:56.686 "driver_specific": { 00:22:56.686 "passthru": { 00:22:56.686 "name": "pt1", 00:22:56.686 "base_bdev_name": "malloc1" 00:22:56.686 } 00:22:56.686 } 00:22:56.686 }' 00:22:56.686 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:56.686 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:56.686 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:56.686 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:56.947 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:56.947 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:56.947 12:03:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:56.947 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:56.947 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:56.947 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:56.947 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:56.947 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:56.947 12:03:47 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:56.947 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:56.947 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:57.205 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:57.205 "name": "pt2", 00:22:57.205 "aliases": [ 00:22:57.205 "00000000-0000-0000-0000-000000000002" 00:22:57.205 ], 00:22:57.205 "product_name": "passthru", 00:22:57.205 "block_size": 4128, 00:22:57.205 "num_blocks": 8192, 00:22:57.205 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:57.205 "md_size": 32, 00:22:57.205 "md_interleave": true, 00:22:57.205 "dif_type": 0, 00:22:57.205 "assigned_rate_limits": { 00:22:57.205 "rw_ios_per_sec": 0, 00:22:57.206 "rw_mbytes_per_sec": 0, 00:22:57.206 "r_mbytes_per_sec": 0, 00:22:57.206 "w_mbytes_per_sec": 0 00:22:57.206 }, 00:22:57.206 "claimed": true, 00:22:57.206 "claim_type": "exclusive_write", 00:22:57.206 "zoned": false, 00:22:57.206 "supported_io_types": { 00:22:57.206 "read": true, 00:22:57.206 "write": true, 00:22:57.206 "unmap": true, 00:22:57.206 "flush": true, 00:22:57.206 "reset": true, 00:22:57.206 "nvme_admin": false, 00:22:57.206 "nvme_io": false, 00:22:57.206 "nvme_io_md": false, 00:22:57.206 "write_zeroes": true, 00:22:57.206 "zcopy": true, 00:22:57.206 "get_zone_info": false, 00:22:57.206 "zone_management": false, 00:22:57.206 "zone_append": false, 00:22:57.206 "compare": false, 00:22:57.206 "compare_and_write": false, 00:22:57.206 "abort": true, 00:22:57.206 "seek_hole": false, 00:22:57.206 "seek_data": false, 00:22:57.206 "copy": true, 00:22:57.206 "nvme_iov_md": false 00:22:57.206 }, 00:22:57.206 "memory_domains": [ 00:22:57.206 { 00:22:57.206 "dma_device_id": 
"system", 00:22:57.206 "dma_device_type": 1 00:22:57.206 }, 00:22:57.206 { 00:22:57.206 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:57.206 "dma_device_type": 2 00:22:57.206 } 00:22:57.206 ], 00:22:57.206 "driver_specific": { 00:22:57.206 "passthru": { 00:22:57.206 "name": "pt2", 00:22:57.206 "base_bdev_name": "malloc2" 00:22:57.206 } 00:22:57.206 } 00:22:57.206 }' 00:22:57.206 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:57.206 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:57.206 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:57.206 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:57.206 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:57.465 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:57.465 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:57.465 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:57.465 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:57.465 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:57.465 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:57.465 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:57.465 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:57.465 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:57.724 [2024-07-12 12:03:47.764558] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:57.724 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7c0a4d67-1e6b-4823-9edc-a42b2e40e55c 00:22:57.724 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 7c0a4d67-1e6b-4823-9edc-a42b2e40e55c ']' 00:22:57.724 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:57.724 [2024-07-12 12:03:47.932847] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:57.725 [2024-07-12 12:03:47.932860] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:57.725 [2024-07-12 12:03:47.932899] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:57.725 [2024-07-12 12:03:47.932935] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:57.725 [2024-07-12 12:03:47.932941] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a0dd0 name raid_bdev1, state offline 00:22:57.725 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.725 12:03:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:57.984 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:57.984 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:57.984 12:03:48 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:57.984 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:58.242 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:58.242 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:58.242 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:58.242 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:58.502 12:03:48 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:58.502 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:58.762 [2024-07-12 12:03:48.762982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:58.762 [2024-07-12 12:03:48.764150] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:58.762 [2024-07-12 12:03:48.764196] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:58.762 [2024-07-12 12:03:48.764222] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:58.762 [2024-07-12 12:03:48.764232] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:58.762 [2024-07-12 12:03:48.764238] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a1aa0 name raid_bdev1, state configuring 00:22:58.762 request: 00:22:58.762 { 00:22:58.762 "name": "raid_bdev1", 00:22:58.762 "raid_level": "raid1", 00:22:58.762 "base_bdevs": [ 00:22:58.762 "malloc1", 00:22:58.762 "malloc2" 00:22:58.762 ], 00:22:58.762 "superblock": false, 00:22:58.762 "method": "bdev_raid_create", 00:22:58.762 "req_id": 1 00:22:58.762 } 00:22:58.762 Got JSON-RPC error response 00:22:58.762 response: 00:22:58.762 { 00:22:58.762 "code": -17, 00:22:58.762 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:58.762 } 00:22:58.762 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:22:58.762 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:58.762 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:58.762 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:58.762 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.762 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:58.762 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:58.762 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:58.762 12:03:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:22:59.021 [2024-07-12 12:03:49.099824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:59.021 [2024-07-12 12:03:49.099857] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:59.021 [2024-07-12 12:03:49.099869] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x81dd30 00:22:59.021 [2024-07-12 12:03:49.099891] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:59.021 [2024-07-12 12:03:49.100930] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:59.021 [2024-07-12 12:03:49.100949] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:59.021 [2024-07-12 12:03:49.100981] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:59.021 [2024-07-12 12:03:49.100999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:59.021 pt1 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.021 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.280 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:59.280 "name": "raid_bdev1", 00:22:59.280 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:22:59.280 "strip_size_kb": 0, 00:22:59.280 "state": "configuring", 00:22:59.280 "raid_level": "raid1", 00:22:59.280 "superblock": true, 00:22:59.280 "num_base_bdevs": 2, 00:22:59.280 "num_base_bdevs_discovered": 1, 00:22:59.280 "num_base_bdevs_operational": 2, 00:22:59.280 "base_bdevs_list": [ 00:22:59.280 { 00:22:59.280 "name": "pt1", 00:22:59.280 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:59.280 "is_configured": true, 00:22:59.280 "data_offset": 256, 00:22:59.280 "data_size": 7936 00:22:59.280 }, 00:22:59.280 { 00:22:59.280 "name": null, 00:22:59.280 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:59.280 "is_configured": false, 00:22:59.280 "data_offset": 256, 00:22:59.280 "data_size": 7936 00:22:59.280 } 00:22:59.280 ] 00:22:59.280 }' 00:22:59.280 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:59.280 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:59.848 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:22:59.848 12:03:49 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:59.848 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:59.848 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:59.848 [2024-07-12 12:03:49.938004] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:59.848 [2024-07-12 12:03:49.938043] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:59.848 [2024-07-12 12:03:49.938073] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a42c0 00:22:59.848 [2024-07-12 12:03:49.938080] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:59.848 [2024-07-12 12:03:49.938208] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:59.848 [2024-07-12 12:03:49.938216] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:59.848 [2024-07-12 12:03:49.938248] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:59.848 [2024-07-12 12:03:49.938260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:59.848 [2024-07-12 12:03:49.938321] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x81e800 00:22:59.848 [2024-07-12 12:03:49.938327] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:59.849 [2024-07-12 12:03:49.938362] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x81db40 00:22:59.849 [2024-07-12 12:03:49.938413] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x81e800 00:22:59.849 [2024-07-12 12:03:49.938419] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x81e800 00:22:59.849 [2024-07-12 12:03:49.938456] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:59.849 pt2 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:59.849 12:03:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.849 12:03:49 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.108 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:00.108 "name": "raid_bdev1", 00:23:00.108 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:23:00.108 "strip_size_kb": 0, 00:23:00.108 "state": "online", 00:23:00.108 "raid_level": "raid1", 00:23:00.108 "superblock": true, 00:23:00.108 "num_base_bdevs": 2, 00:23:00.108 "num_base_bdevs_discovered": 2, 00:23:00.108 "num_base_bdevs_operational": 2, 00:23:00.108 "base_bdevs_list": [ 00:23:00.108 { 00:23:00.108 "name": "pt1", 00:23:00.108 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:00.108 "is_configured": true, 00:23:00.108 "data_offset": 256, 00:23:00.108 "data_size": 7936 00:23:00.108 }, 00:23:00.108 { 00:23:00.108 "name": "pt2", 00:23:00.108 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:00.108 "is_configured": true, 00:23:00.108 "data_offset": 256, 00:23:00.108 "data_size": 7936 00:23:00.108 } 00:23:00.108 ] 00:23:00.108 }' 00:23:00.108 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:00.108 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:00.676 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:00.676 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:00.676 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:00.676 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:00.676 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:00.676 12:03:50 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:23:00.676 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:00.676 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:00.676 [2024-07-12 12:03:50.772334] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:00.676 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:00.676 "name": "raid_bdev1", 00:23:00.676 "aliases": [ 00:23:00.676 "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c" 00:23:00.676 ], 00:23:00.676 "product_name": "Raid Volume", 00:23:00.676 "block_size": 4128, 00:23:00.676 "num_blocks": 7936, 00:23:00.676 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:23:00.676 "md_size": 32, 00:23:00.676 "md_interleave": true, 00:23:00.676 "dif_type": 0, 00:23:00.676 "assigned_rate_limits": { 00:23:00.676 "rw_ios_per_sec": 0, 00:23:00.676 "rw_mbytes_per_sec": 0, 00:23:00.676 "r_mbytes_per_sec": 0, 00:23:00.676 "w_mbytes_per_sec": 0 00:23:00.676 }, 00:23:00.676 "claimed": false, 00:23:00.676 "zoned": false, 00:23:00.676 "supported_io_types": { 00:23:00.676 "read": true, 00:23:00.676 "write": true, 00:23:00.676 "unmap": false, 00:23:00.676 "flush": false, 00:23:00.676 "reset": true, 00:23:00.676 "nvme_admin": false, 00:23:00.676 "nvme_io": false, 00:23:00.676 "nvme_io_md": false, 00:23:00.676 "write_zeroes": true, 00:23:00.676 "zcopy": false, 00:23:00.676 "get_zone_info": false, 00:23:00.676 "zone_management": false, 00:23:00.676 "zone_append": false, 00:23:00.676 "compare": false, 00:23:00.676 "compare_and_write": false, 00:23:00.676 "abort": false, 00:23:00.677 "seek_hole": false, 00:23:00.677 "seek_data": false, 00:23:00.677 "copy": false, 00:23:00.677 "nvme_iov_md": false 00:23:00.677 }, 
00:23:00.677 "memory_domains": [ 00:23:00.677 { 00:23:00.677 "dma_device_id": "system", 00:23:00.677 "dma_device_type": 1 00:23:00.677 }, 00:23:00.677 { 00:23:00.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.677 "dma_device_type": 2 00:23:00.677 }, 00:23:00.677 { 00:23:00.677 "dma_device_id": "system", 00:23:00.677 "dma_device_type": 1 00:23:00.677 }, 00:23:00.677 { 00:23:00.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.677 "dma_device_type": 2 00:23:00.677 } 00:23:00.677 ], 00:23:00.677 "driver_specific": { 00:23:00.677 "raid": { 00:23:00.677 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:23:00.677 "strip_size_kb": 0, 00:23:00.677 "state": "online", 00:23:00.677 "raid_level": "raid1", 00:23:00.677 "superblock": true, 00:23:00.677 "num_base_bdevs": 2, 00:23:00.677 "num_base_bdevs_discovered": 2, 00:23:00.677 "num_base_bdevs_operational": 2, 00:23:00.677 "base_bdevs_list": [ 00:23:00.677 { 00:23:00.677 "name": "pt1", 00:23:00.677 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:00.677 "is_configured": true, 00:23:00.677 "data_offset": 256, 00:23:00.677 "data_size": 7936 00:23:00.677 }, 00:23:00.677 { 00:23:00.677 "name": "pt2", 00:23:00.677 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:00.677 "is_configured": true, 00:23:00.677 "data_offset": 256, 00:23:00.677 "data_size": 7936 00:23:00.677 } 00:23:00.677 ] 00:23:00.677 } 00:23:00.677 } 00:23:00.677 }' 00:23:00.677 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:00.677 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:00.677 pt2' 00:23:00.677 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:00.677 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:00.677 12:03:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:00.936 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:00.936 "name": "pt1", 00:23:00.936 "aliases": [ 00:23:00.936 "00000000-0000-0000-0000-000000000001" 00:23:00.936 ], 00:23:00.936 "product_name": "passthru", 00:23:00.936 "block_size": 4128, 00:23:00.936 "num_blocks": 8192, 00:23:00.936 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:00.936 "md_size": 32, 00:23:00.936 "md_interleave": true, 00:23:00.936 "dif_type": 0, 00:23:00.936 "assigned_rate_limits": { 00:23:00.936 "rw_ios_per_sec": 0, 00:23:00.936 "rw_mbytes_per_sec": 0, 00:23:00.936 "r_mbytes_per_sec": 0, 00:23:00.936 "w_mbytes_per_sec": 0 00:23:00.936 }, 00:23:00.936 "claimed": true, 00:23:00.936 "claim_type": "exclusive_write", 00:23:00.936 "zoned": false, 00:23:00.936 "supported_io_types": { 00:23:00.936 "read": true, 00:23:00.936 "write": true, 00:23:00.936 "unmap": true, 00:23:00.936 "flush": true, 00:23:00.936 "reset": true, 00:23:00.936 "nvme_admin": false, 00:23:00.936 "nvme_io": false, 00:23:00.936 "nvme_io_md": false, 00:23:00.936 "write_zeroes": true, 00:23:00.936 "zcopy": true, 00:23:00.936 "get_zone_info": false, 00:23:00.936 "zone_management": false, 00:23:00.936 "zone_append": false, 00:23:00.936 "compare": false, 00:23:00.936 "compare_and_write": false, 00:23:00.936 "abort": true, 00:23:00.936 "seek_hole": false, 00:23:00.936 "seek_data": false, 00:23:00.936 "copy": true, 00:23:00.936 "nvme_iov_md": false 00:23:00.936 }, 00:23:00.936 "memory_domains": [ 00:23:00.936 { 00:23:00.936 "dma_device_id": "system", 00:23:00.936 "dma_device_type": 1 00:23:00.936 }, 00:23:00.936 { 00:23:00.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.936 "dma_device_type": 2 00:23:00.936 } 00:23:00.936 ], 00:23:00.936 
"driver_specific": { 00:23:00.936 "passthru": { 00:23:00.936 "name": "pt1", 00:23:00.936 "base_bdev_name": "malloc1" 00:23:00.936 } 00:23:00.936 } 00:23:00.936 }' 00:23:00.936 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.936 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.936 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:23:00.936 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.936 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.936 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:00.936 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:01.195 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:01.195 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:23:01.195 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.195 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.195 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:01.195 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:01.195 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:01.195 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:01.454 12:03:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:01.454 "name": "pt2", 00:23:01.454 "aliases": [ 00:23:01.454 "00000000-0000-0000-0000-000000000002" 00:23:01.454 ], 00:23:01.454 "product_name": "passthru", 00:23:01.454 "block_size": 4128, 00:23:01.454 "num_blocks": 8192, 00:23:01.454 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:01.454 "md_size": 32, 00:23:01.454 "md_interleave": true, 00:23:01.454 "dif_type": 0, 00:23:01.454 "assigned_rate_limits": { 00:23:01.454 "rw_ios_per_sec": 0, 00:23:01.454 "rw_mbytes_per_sec": 0, 00:23:01.454 "r_mbytes_per_sec": 0, 00:23:01.454 "w_mbytes_per_sec": 0 00:23:01.454 }, 00:23:01.454 "claimed": true, 00:23:01.454 "claim_type": "exclusive_write", 00:23:01.454 "zoned": false, 00:23:01.454 "supported_io_types": { 00:23:01.454 "read": true, 00:23:01.454 "write": true, 00:23:01.454 "unmap": true, 00:23:01.454 "flush": true, 00:23:01.454 "reset": true, 00:23:01.454 "nvme_admin": false, 00:23:01.454 "nvme_io": false, 00:23:01.454 "nvme_io_md": false, 00:23:01.454 "write_zeroes": true, 00:23:01.454 "zcopy": true, 00:23:01.454 "get_zone_info": false, 00:23:01.454 "zone_management": false, 00:23:01.454 "zone_append": false, 00:23:01.454 "compare": false, 00:23:01.454 "compare_and_write": false, 00:23:01.454 "abort": true, 00:23:01.454 "seek_hole": false, 00:23:01.454 "seek_data": false, 00:23:01.454 "copy": true, 00:23:01.454 "nvme_iov_md": false 00:23:01.454 }, 00:23:01.454 "memory_domains": [ 00:23:01.454 { 00:23:01.454 "dma_device_id": "system", 00:23:01.454 "dma_device_type": 1 00:23:01.454 }, 00:23:01.454 { 00:23:01.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:01.454 "dma_device_type": 2 00:23:01.454 } 00:23:01.454 ], 00:23:01.454 "driver_specific": { 00:23:01.454 "passthru": { 00:23:01.454 "name": "pt2", 00:23:01.454 "base_bdev_name": "malloc2" 00:23:01.454 } 00:23:01.454 } 00:23:01.454 }' 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:23:01.454 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.713 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.713 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:01.713 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:01.713 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:01.713 [2024-07-12 12:03:51.919316] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:01.713 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 7c0a4d67-1e6b-4823-9edc-a42b2e40e55c '!=' 7c0a4d67-1e6b-4823-9edc-a42b2e40e55c ']' 00:23:01.713 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:01.713 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:01.713 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:23:01.713 12:03:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:01.972 [2024-07-12 12:03:52.087591] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:01.972 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.972 12:03:52 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.231 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.231 "name": "raid_bdev1", 00:23:02.231 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:23:02.231 "strip_size_kb": 0, 00:23:02.231 "state": "online", 00:23:02.231 "raid_level": "raid1", 00:23:02.231 "superblock": true, 00:23:02.231 "num_base_bdevs": 2, 00:23:02.231 "num_base_bdevs_discovered": 1, 00:23:02.231 "num_base_bdevs_operational": 1, 00:23:02.231 "base_bdevs_list": [ 00:23:02.231 { 00:23:02.231 "name": null, 00:23:02.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.231 "is_configured": false, 00:23:02.231 "data_offset": 256, 00:23:02.231 "data_size": 7936 00:23:02.231 }, 00:23:02.231 { 00:23:02.231 "name": "pt2", 00:23:02.231 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:02.231 "is_configured": true, 00:23:02.231 "data_offset": 256, 00:23:02.231 "data_size": 7936 00:23:02.231 } 00:23:02.231 ] 00:23:02.231 }' 00:23:02.231 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.231 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:02.799 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:02.799 [2024-07-12 12:03:52.905677] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:02.799 [2024-07-12 12:03:52.905700] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:02.799 [2024-07-12 12:03:52.905743] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:02.799 [2024-07-12 
12:03:52.905773] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:02.799 [2024-07-12 12:03:52.905779] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x81e800 name raid_bdev1, state offline 00:23:02.799 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.799 12:03:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:03.056 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:03.056 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:03.057 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:03.057 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:03.057 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:03.057 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:03.057 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:03.057 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:03.057 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:03.057 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:23:03.057 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:03.314 [2024-07-12 12:03:53.439031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:03.314 [2024-07-12 12:03:53.439066] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.314 [2024-07-12 12:03:53.439077] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a4670 00:23:03.314 [2024-07-12 12:03:53.439083] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.314 [2024-07-12 12:03:53.440146] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.314 [2024-07-12 12:03:53.440166] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:03.314 [2024-07-12 12:03:53.440198] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:03.314 [2024-07-12 12:03:53.440219] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:03.314 [2024-07-12 12:03:53.440268] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9a3e80 00:23:03.314 [2024-07-12 12:03:53.440281] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:03.314 [2024-07-12 12:03:53.440323] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a2950 00:23:03.314 [2024-07-12 12:03:53.440373] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9a3e80 00:23:03.314 [2024-07-12 12:03:53.440378] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9a3e80 00:23:03.314 [2024-07-12 12:03:53.440415] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:03.314 pt2 00:23:03.314 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:03.314 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:03.314 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:03.314 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:03.314 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:03.315 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:03.315 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:03.315 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:03.315 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:03.315 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:03.315 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.315 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.573 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:03.573 "name": "raid_bdev1", 00:23:03.573 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:23:03.573 "strip_size_kb": 0, 00:23:03.573 "state": "online", 00:23:03.573 "raid_level": "raid1", 00:23:03.573 "superblock": true, 00:23:03.573 "num_base_bdevs": 2, 00:23:03.573 "num_base_bdevs_discovered": 1, 00:23:03.573 "num_base_bdevs_operational": 1, 00:23:03.573 
"base_bdevs_list": [ 00:23:03.573 { 00:23:03.573 "name": null, 00:23:03.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:03.573 "is_configured": false, 00:23:03.573 "data_offset": 256, 00:23:03.573 "data_size": 7936 00:23:03.573 }, 00:23:03.573 { 00:23:03.573 "name": "pt2", 00:23:03.573 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:03.573 "is_configured": true, 00:23:03.573 "data_offset": 256, 00:23:03.573 "data_size": 7936 00:23:03.573 } 00:23:03.573 ] 00:23:03.573 }' 00:23:03.573 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:03.573 12:03:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:04.139 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:04.139 [2024-07-12 12:03:54.237083] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:04.139 [2024-07-12 12:03:54.237103] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:04.139 [2024-07-12 12:03:54.237139] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:04.139 [2024-07-12 12:03:54.237167] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:04.139 [2024-07-12 12:03:54.237173] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a3e80 name raid_bdev1, state offline 00:23:04.139 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:04.139 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:04.398 [2024-07-12 12:03:54.585991] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:04.398 [2024-07-12 12:03:54.586023] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:04.398 [2024-07-12 12:03:54.586033] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x81f430 00:23:04.398 [2024-07-12 12:03:54.586040] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:04.398 [2024-07-12 12:03:54.587181] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:04.398 [2024-07-12 12:03:54.587200] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:04.398 [2024-07-12 12:03:54.587237] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:04.398 [2024-07-12 12:03:54.587256] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:04.398 [2024-07-12 12:03:54.587317] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:04.398 [2024-07-12 12:03:54.587325] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:04.398 [2024-07-12 12:03:54.587335] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a48c0 name raid_bdev1, state configuring 00:23:04.398 [2024-07-12 12:03:54.587350] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:04.398 [2024-07-12 12:03:54.587389] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9a48c0 00:23:04.398 [2024-07-12 12:03:54.587395] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:04.398 [2024-07-12 12:03:54.587437] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a3b00 00:23:04.398 [2024-07-12 12:03:54.587493] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9a48c0 00:23:04.398 [2024-07-12 12:03:54.587499] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9a48c0 00:23:04.398 [2024-07-12 12:03:54.587549] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:04.398 pt1 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.398 12:03:54 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.398 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.657 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.657 "name": "raid_bdev1", 00:23:04.657 "uuid": "7c0a4d67-1e6b-4823-9edc-a42b2e40e55c", 00:23:04.657 "strip_size_kb": 0, 00:23:04.657 "state": "online", 00:23:04.657 "raid_level": "raid1", 00:23:04.657 "superblock": true, 00:23:04.657 "num_base_bdevs": 2, 00:23:04.657 "num_base_bdevs_discovered": 1, 00:23:04.657 "num_base_bdevs_operational": 1, 00:23:04.657 "base_bdevs_list": [ 00:23:04.657 { 00:23:04.657 "name": null, 00:23:04.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.657 "is_configured": false, 00:23:04.657 "data_offset": 256, 00:23:04.657 "data_size": 7936 00:23:04.657 }, 00:23:04.657 { 00:23:04.657 "name": "pt2", 00:23:04.657 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:04.657 "is_configured": true, 00:23:04.657 "data_offset": 256, 00:23:04.657 "data_size": 7936 00:23:04.657 } 00:23:04.657 ] 00:23:04.657 }' 00:23:04.657 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.657 12:03:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:05.223 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:05.223 
12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:05.223 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:23:05.223 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:05.223 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:23:05.482 [2024-07-12 12:03:55.592746] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 7c0a4d67-1e6b-4823-9edc-a42b2e40e55c '!=' 7c0a4d67-1e6b-4823-9edc-a42b2e40e55c ']' 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 743968 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 743968 ']' 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 743968 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 743968 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- 
# echo 'killing process with pid 743968' 00:23:05.482 killing process with pid 743968 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 743968 00:23:05.482 [2024-07-12 12:03:55.657941] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:05.482 [2024-07-12 12:03:55.657984] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:05.482 [2024-07-12 12:03:55.658014] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:05.482 [2024-07-12 12:03:55.658020] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a48c0 name raid_bdev1, state offline 00:23:05.482 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 743968 00:23:05.482 [2024-07-12 12:03:55.673737] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:05.741 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:23:05.741 00:23:05.741 real 0m11.745s 00:23:05.741 user 0m21.611s 00:23:05.741 sys 0m1.839s 00:23:05.741 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:05.741 12:03:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:05.741 ************************************ 00:23:05.741 END TEST raid_superblock_test_md_interleaved 00:23:05.741 ************************************ 00:23:05.741 12:03:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:05.741 12:03:55 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:23:05.741 12:03:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:05.741 12:03:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:05.741 12:03:55 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:23:05.741 ************************************ 00:23:05.741 START TEST raid_rebuild_test_sb_md_interleaved 00:23:05.741 ************************************ 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 
'BaseBdev2') 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=746244 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 746244 /var/tmp/spdk-raid.sock 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 746244 ']' 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:05.741 
12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:05.741 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:05.741 12:03:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:05.741 [2024-07-12 12:03:55.980321] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:05.741 [2024-07-12 12:03:55.980360] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid746244 ] 00:23:05.741 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:05.741 Zero copy mechanism will not be used. 
00:23:06.000 [2024-07-12 12:03:56.045445] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:06.000 [2024-07-12 12:03:56.116099] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:06.000 [2024-07-12 12:03:56.168130] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:06.000 [2024-07-12 12:03:56.168156] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:06.568 12:03:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:06.568 12:03:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:23:06.568 12:03:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:06.568 12:03:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:23:06.826 BaseBdev1_malloc 00:23:06.826 12:03:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:07.085 [2024-07-12 12:03:57.091711] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:07.085 [2024-07-12 12:03:57.091750] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.085 [2024-07-12 12:03:57.091765] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x155cee0 00:23:07.085 [2024-07-12 12:03:57.091771] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.085 [2024-07-12 12:03:57.092736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.085 [2024-07-12 12:03:57.092757] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:07.085 BaseBdev1 00:23:07.085 12:03:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:07.085 12:03:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:23:07.085 BaseBdev2_malloc 00:23:07.085 12:03:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:07.343 [2024-07-12 12:03:57.448532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:07.343 [2024-07-12 12:03:57.448566] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.343 [2024-07-12 12:03:57.448578] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1554700 00:23:07.343 [2024-07-12 12:03:57.448600] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.343 [2024-07-12 12:03:57.449725] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.343 [2024-07-12 12:03:57.449746] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:07.343 BaseBdev2 00:23:07.343 12:03:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:23:07.602 spare_malloc 00:23:07.602 12:03:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:23:07.602 spare_delay 00:23:07.602 12:03:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:07.860 [2024-07-12 12:03:57.949687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:07.860 [2024-07-12 12:03:57.949716] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.860 [2024-07-12 12:03:57.949727] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15577e0 00:23:07.860 [2024-07-12 12:03:57.949733] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.860 [2024-07-12 12:03:57.950546] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.860 [2024-07-12 12:03:57.950563] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:07.860 spare 00:23:07.860 12:03:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:08.118 [2024-07-12 12:03:58.114126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:08.118 [2024-07-12 12:03:58.114884] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:08.118 [2024-07-12 12:03:58.114987] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1559960 00:23:08.118 [2024-07-12 12:03:58.114994] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:08.118 [2024-07-12 12:03:58.115033] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13bfbc0 00:23:08.118 [2024-07-12 12:03:58.115086] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x1559960 00:23:08.118 [2024-07-12 12:03:58.115091] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1559960 00:23:08.118 [2024-07-12 12:03:58.115123] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.118 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:23:08.118 "name": "raid_bdev1", 00:23:08.118 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:08.118 "strip_size_kb": 0, 00:23:08.118 "state": "online", 00:23:08.118 "raid_level": "raid1", 00:23:08.118 "superblock": true, 00:23:08.118 "num_base_bdevs": 2, 00:23:08.118 "num_base_bdevs_discovered": 2, 00:23:08.118 "num_base_bdevs_operational": 2, 00:23:08.118 "base_bdevs_list": [ 00:23:08.118 { 00:23:08.118 "name": "BaseBdev1", 00:23:08.118 "uuid": "696e0ee1-0271-5656-8e80-d83771550c03", 00:23:08.118 "is_configured": true, 00:23:08.119 "data_offset": 256, 00:23:08.119 "data_size": 7936 00:23:08.119 }, 00:23:08.119 { 00:23:08.119 "name": "BaseBdev2", 00:23:08.119 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:08.119 "is_configured": true, 00:23:08.119 "data_offset": 256, 00:23:08.119 "data_size": 7936 00:23:08.119 } 00:23:08.119 ] 00:23:08.119 }' 00:23:08.119 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.119 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:08.721 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:08.722 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:08.722 [2024-07-12 12:03:58.952460] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:08.980 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:23:08.980 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.980 12:03:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:23:08.980 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:23:08.980 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:08.980 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:23:08.980 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:09.239 [2024-07-12 12:03:59.301184] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.239 
12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.239 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.497 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.497 "name": "raid_bdev1", 00:23:09.497 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:09.497 "strip_size_kb": 0, 00:23:09.497 "state": "online", 00:23:09.497 "raid_level": "raid1", 00:23:09.497 "superblock": true, 00:23:09.497 "num_base_bdevs": 2, 00:23:09.497 "num_base_bdevs_discovered": 1, 00:23:09.497 "num_base_bdevs_operational": 1, 00:23:09.497 "base_bdevs_list": [ 00:23:09.497 { 00:23:09.497 "name": null, 00:23:09.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.497 "is_configured": false, 00:23:09.497 "data_offset": 256, 00:23:09.497 "data_size": 7936 00:23:09.497 }, 00:23:09.497 { 00:23:09.497 "name": "BaseBdev2", 00:23:09.497 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:09.497 "is_configured": true, 00:23:09.497 "data_offset": 256, 00:23:09.497 "data_size": 7936 00:23:09.497 } 00:23:09.497 ] 00:23:09.497 }' 00:23:09.497 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.497 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:09.755 12:03:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:10.014 [2024-07-12 12:04:00.139377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:10.014 [2024-07-12 12:04:00.142574] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1559750 00:23:10.014 [2024-07-12 12:04:00.143839] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:10.014 12:04:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:10.951 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:10.951 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:10.951 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:10.951 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:10.951 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:10.951 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.951 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.210 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:11.210 "name": "raid_bdev1", 00:23:11.210 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:11.210 "strip_size_kb": 0, 00:23:11.210 "state": "online", 00:23:11.210 "raid_level": "raid1", 00:23:11.211 "superblock": true, 00:23:11.211 "num_base_bdevs": 2, 00:23:11.211 "num_base_bdevs_discovered": 2, 00:23:11.211 "num_base_bdevs_operational": 2, 00:23:11.211 "process": { 00:23:11.211 "type": "rebuild", 00:23:11.211 "target": "spare", 00:23:11.211 "progress": { 00:23:11.211 "blocks": 2816, 00:23:11.211 "percent": 35 00:23:11.211 } 00:23:11.211 }, 00:23:11.211 "base_bdevs_list": [ 00:23:11.211 { 
00:23:11.211 "name": "spare", 00:23:11.211 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:11.211 "is_configured": true, 00:23:11.211 "data_offset": 256, 00:23:11.211 "data_size": 7936 00:23:11.211 }, 00:23:11.211 { 00:23:11.211 "name": "BaseBdev2", 00:23:11.211 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:11.211 "is_configured": true, 00:23:11.211 "data_offset": 256, 00:23:11.211 "data_size": 7936 00:23:11.211 } 00:23:11.211 ] 00:23:11.211 }' 00:23:11.211 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:11.211 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:11.211 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:11.211 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:11.211 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:11.470 [2024-07-12 12:04:01.560076] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:11.470 [2024-07-12 12:04:01.654396] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:11.470 [2024-07-12 12:04:01.654433] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:11.470 [2024-07-12 12:04:01.654443] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:11.470 [2024-07-12 12:04:01.654447] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:11.470 12:04:01 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.470 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.728 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.728 "name": "raid_bdev1", 00:23:11.728 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:11.728 "strip_size_kb": 0, 00:23:11.728 "state": "online", 00:23:11.728 "raid_level": "raid1", 00:23:11.728 "superblock": true, 00:23:11.728 "num_base_bdevs": 2, 00:23:11.728 "num_base_bdevs_discovered": 1, 00:23:11.729 "num_base_bdevs_operational": 1, 00:23:11.729 "base_bdevs_list": [ 00:23:11.729 { 00:23:11.729 "name": null, 00:23:11.729 
"uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.729 "is_configured": false, 00:23:11.729 "data_offset": 256, 00:23:11.729 "data_size": 7936 00:23:11.729 }, 00:23:11.729 { 00:23:11.729 "name": "BaseBdev2", 00:23:11.729 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:11.729 "is_configured": true, 00:23:11.729 "data_offset": 256, 00:23:11.729 "data_size": 7936 00:23:11.729 } 00:23:11.729 ] 00:23:11.729 }' 00:23:11.729 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.729 12:04:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:12.297 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:12.297 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:12.297 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:12.297 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:12.297 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:12.297 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.297 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.297 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:12.297 "name": "raid_bdev1", 00:23:12.297 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:12.297 "strip_size_kb": 0, 00:23:12.297 "state": "online", 00:23:12.297 "raid_level": "raid1", 00:23:12.297 "superblock": true, 00:23:12.297 
"num_base_bdevs": 2, 00:23:12.297 "num_base_bdevs_discovered": 1, 00:23:12.297 "num_base_bdevs_operational": 1, 00:23:12.297 "base_bdevs_list": [ 00:23:12.297 { 00:23:12.297 "name": null, 00:23:12.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.297 "is_configured": false, 00:23:12.297 "data_offset": 256, 00:23:12.297 "data_size": 7936 00:23:12.297 }, 00:23:12.297 { 00:23:12.297 "name": "BaseBdev2", 00:23:12.297 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:12.297 "is_configured": true, 00:23:12.297 "data_offset": 256, 00:23:12.297 "data_size": 7936 00:23:12.297 } 00:23:12.297 ] 00:23:12.297 }' 00:23:12.297 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:12.557 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:12.557 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:12.557 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:12.557 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:12.557 [2024-07-12 12:04:02.748810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:12.557 [2024-07-12 12:04:02.751973] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1556130 00:23:12.557 [2024-07-12 12:04:02.752988] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:12.557 12:04:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:13.935 "name": "raid_bdev1", 00:23:13.935 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:13.935 "strip_size_kb": 0, 00:23:13.935 "state": "online", 00:23:13.935 "raid_level": "raid1", 00:23:13.935 "superblock": true, 00:23:13.935 "num_base_bdevs": 2, 00:23:13.935 "num_base_bdevs_discovered": 2, 00:23:13.935 "num_base_bdevs_operational": 2, 00:23:13.935 "process": { 00:23:13.935 "type": "rebuild", 00:23:13.935 "target": "spare", 00:23:13.935 "progress": { 00:23:13.935 "blocks": 2816, 00:23:13.935 "percent": 35 00:23:13.935 } 00:23:13.935 }, 00:23:13.935 "base_bdevs_list": [ 00:23:13.935 { 00:23:13.935 "name": "spare", 00:23:13.935 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:13.935 "is_configured": true, 00:23:13.935 "data_offset": 256, 00:23:13.935 "data_size": 7936 00:23:13.935 }, 00:23:13.935 { 00:23:13.935 "name": "BaseBdev2", 00:23:13.935 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:13.935 "is_configured": true, 00:23:13.935 "data_offset": 256, 00:23:13.935 "data_size": 7936 00:23:13.935 
} 00:23:13.935 ] 00:23:13.935 }' 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:13.935 12:04:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:13.935 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=864 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.935 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.194 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.194 "name": "raid_bdev1", 00:23:14.194 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:14.194 "strip_size_kb": 0, 00:23:14.194 "state": "online", 00:23:14.194 "raid_level": "raid1", 00:23:14.194 "superblock": true, 00:23:14.194 "num_base_bdevs": 2, 00:23:14.194 "num_base_bdevs_discovered": 2, 00:23:14.194 "num_base_bdevs_operational": 2, 00:23:14.194 "process": { 00:23:14.194 "type": "rebuild", 00:23:14.194 "target": "spare", 00:23:14.194 "progress": { 00:23:14.194 "blocks": 3584, 00:23:14.194 "percent": 45 00:23:14.194 } 00:23:14.194 }, 00:23:14.194 "base_bdevs_list": [ 00:23:14.194 { 00:23:14.194 "name": "spare", 00:23:14.194 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:14.194 "is_configured": true, 00:23:14.194 "data_offset": 256, 00:23:14.194 "data_size": 7936 00:23:14.194 }, 00:23:14.194 { 00:23:14.194 "name": "BaseBdev2", 00:23:14.194 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:14.194 "is_configured": true, 00:23:14.194 "data_offset": 256, 00:23:14.194 "data_size": 7936 00:23:14.194 } 00:23:14.194 ] 00:23:14.194 }' 00:23:14.194 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.194 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:14.194 12:04:04 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.194 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:14.194 12:04:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:15.130 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:15.130 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:15.130 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:15.130 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:15.130 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:15.130 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:15.130 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.130 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.389 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:15.389 "name": "raid_bdev1", 00:23:15.389 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:15.389 "strip_size_kb": 0, 00:23:15.389 "state": "online", 00:23:15.389 "raid_level": "raid1", 00:23:15.389 "superblock": true, 00:23:15.389 "num_base_bdevs": 2, 00:23:15.389 "num_base_bdevs_discovered": 2, 00:23:15.389 "num_base_bdevs_operational": 2, 00:23:15.389 "process": { 00:23:15.389 "type": "rebuild", 00:23:15.389 
"target": "spare", 00:23:15.389 "progress": { 00:23:15.389 "blocks": 6656, 00:23:15.389 "percent": 83 00:23:15.389 } 00:23:15.389 }, 00:23:15.389 "base_bdevs_list": [ 00:23:15.389 { 00:23:15.389 "name": "spare", 00:23:15.389 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:15.389 "is_configured": true, 00:23:15.389 "data_offset": 256, 00:23:15.389 "data_size": 7936 00:23:15.389 }, 00:23:15.389 { 00:23:15.389 "name": "BaseBdev2", 00:23:15.389 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:15.389 "is_configured": true, 00:23:15.389 "data_offset": 256, 00:23:15.389 "data_size": 7936 00:23:15.389 } 00:23:15.389 ] 00:23:15.389 }' 00:23:15.389 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:15.389 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:15.389 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:15.389 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:15.389 12:04:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:15.648 [2024-07-12 12:04:05.874635] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:15.648 [2024-07-12 12:04:05.874680] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:15.648 [2024-07-12 12:04:05.874756] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:16.585 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:16.585 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:16.585 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:16.585 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:16.585 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:16.585 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:16.585 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.585 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.585 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:16.585 "name": "raid_bdev1", 00:23:16.585 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:16.585 "strip_size_kb": 0, 00:23:16.585 "state": "online", 00:23:16.585 "raid_level": "raid1", 00:23:16.585 "superblock": true, 00:23:16.585 "num_base_bdevs": 2, 00:23:16.585 "num_base_bdevs_discovered": 2, 00:23:16.585 "num_base_bdevs_operational": 2, 00:23:16.585 "base_bdevs_list": [ 00:23:16.585 { 00:23:16.585 "name": "spare", 00:23:16.586 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:16.586 "is_configured": true, 00:23:16.586 "data_offset": 256, 00:23:16.586 "data_size": 7936 00:23:16.586 }, 00:23:16.586 { 00:23:16.586 "name": "BaseBdev2", 00:23:16.586 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:16.586 "is_configured": true, 00:23:16.586 "data_offset": 256, 00:23:16.586 "data_size": 7936 00:23:16.586 } 00:23:16.586 ] 00:23:16.586 }' 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.586 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.845 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:16.845 "name": "raid_bdev1", 00:23:16.845 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:16.845 "strip_size_kb": 0, 00:23:16.845 "state": "online", 00:23:16.845 "raid_level": "raid1", 00:23:16.845 "superblock": true, 00:23:16.845 "num_base_bdevs": 2, 00:23:16.845 "num_base_bdevs_discovered": 2, 00:23:16.845 "num_base_bdevs_operational": 2, 00:23:16.845 "base_bdevs_list": [ 00:23:16.845 { 00:23:16.845 "name": "spare", 00:23:16.845 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:16.845 
"is_configured": true, 00:23:16.845 "data_offset": 256, 00:23:16.845 "data_size": 7936 00:23:16.845 }, 00:23:16.845 { 00:23:16.845 "name": "BaseBdev2", 00:23:16.845 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:16.845 "is_configured": true, 00:23:16.845 "data_offset": 256, 00:23:16.845 "data_size": 7936 00:23:16.845 } 00:23:16.845 ] 00:23:16.845 }' 00:23:16.845 12:04:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.845 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.104 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.104 "name": "raid_bdev1", 00:23:17.104 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:17.104 "strip_size_kb": 0, 00:23:17.104 "state": "online", 00:23:17.104 "raid_level": "raid1", 00:23:17.104 "superblock": true, 00:23:17.104 "num_base_bdevs": 2, 00:23:17.104 "num_base_bdevs_discovered": 2, 00:23:17.104 "num_base_bdevs_operational": 2, 00:23:17.104 "base_bdevs_list": [ 00:23:17.104 { 00:23:17.104 "name": "spare", 00:23:17.104 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:17.104 "is_configured": true, 00:23:17.104 "data_offset": 256, 00:23:17.104 "data_size": 7936 00:23:17.104 }, 00:23:17.104 { 00:23:17.104 "name": "BaseBdev2", 00:23:17.104 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:17.104 "is_configured": true, 00:23:17.104 "data_offset": 256, 00:23:17.104 "data_size": 7936 00:23:17.104 } 00:23:17.104 ] 00:23:17.104 }' 00:23:17.104 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.104 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:17.672 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:17.672 [2024-07-12 12:04:07.811371] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:23:17.672 [2024-07-12 12:04:07.811393] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:17.672 [2024-07-12 12:04:07.811439] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:17.672 [2024-07-12 12:04:07.811478] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:17.672 [2024-07-12 12:04:07.811484] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1559960 name raid_bdev1, state offline 00:23:17.672 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:23:17.672 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.931 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:17.931 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:23:17.931 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:17.931 12:04:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:17.931 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:18.190 [2024-07-12 12:04:08.316657] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:18.190 [2024-07-12 12:04:08.316693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:18.190 [2024-07-12 12:04:08.316705] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1557580 00:23:18.190 [2024-07-12 12:04:08.316727] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:18.190 [2024-07-12 12:04:08.317822] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:18.190 [2024-07-12 12:04:08.317843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:18.190 [2024-07-12 12:04:08.317884] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:18.190 [2024-07-12 12:04:08.317904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:18.190 [2024-07-12 12:04:08.317962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:18.190 spare 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:18.190 12:04:08 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.190 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.190 [2024-07-12 12:04:08.418250] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x155b7b0 00:23:18.190 [2024-07-12 12:04:08.418262] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:18.190 [2024-07-12 12:04:08.418315] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x154f3b0 00:23:18.190 [2024-07-12 12:04:08.418380] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x155b7b0 00:23:18.191 [2024-07-12 12:04:08.418385] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x155b7b0 00:23:18.191 [2024-07-12 12:04:08.418428] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:18.450 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.450 "name": "raid_bdev1", 00:23:18.450 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:18.450 "strip_size_kb": 0, 00:23:18.450 "state": "online", 00:23:18.450 "raid_level": "raid1", 00:23:18.450 "superblock": true, 00:23:18.450 "num_base_bdevs": 2, 00:23:18.450 "num_base_bdevs_discovered": 2, 00:23:18.450 "num_base_bdevs_operational": 2, 00:23:18.450 "base_bdevs_list": [ 00:23:18.450 { 00:23:18.450 "name": "spare", 00:23:18.450 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:18.450 "is_configured": true, 00:23:18.450 "data_offset": 256, 00:23:18.450 "data_size": 7936 00:23:18.450 }, 00:23:18.450 { 00:23:18.450 "name": "BaseBdev2", 00:23:18.450 "uuid": 
"c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:18.450 "is_configured": true, 00:23:18.450 "data_offset": 256, 00:23:18.450 "data_size": 7936 00:23:18.450 } 00:23:18.450 ] 00:23:18.450 }' 00:23:18.450 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.450 12:04:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:19.017 "name": "raid_bdev1", 00:23:19.017 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:19.017 "strip_size_kb": 0, 00:23:19.017 "state": "online", 00:23:19.017 "raid_level": "raid1", 00:23:19.017 "superblock": true, 00:23:19.017 "num_base_bdevs": 2, 00:23:19.017 "num_base_bdevs_discovered": 2, 00:23:19.017 "num_base_bdevs_operational": 2, 00:23:19.017 "base_bdevs_list": [ 00:23:19.017 { 00:23:19.017 "name": "spare", 00:23:19.017 "uuid": 
"ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:19.017 "is_configured": true, 00:23:19.017 "data_offset": 256, 00:23:19.017 "data_size": 7936 00:23:19.017 }, 00:23:19.017 { 00:23:19.017 "name": "BaseBdev2", 00:23:19.017 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:19.017 "is_configured": true, 00:23:19.017 "data_offset": 256, 00:23:19.017 "data_size": 7936 00:23:19.017 } 00:23:19.017 ] 00:23:19.017 }' 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:19.017 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:19.276 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:19.276 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.276 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:19.276 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:19.276 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:19.535 [2024-07-12 12:04:09.592012] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:19.535 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:19.535 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:23:19.535 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:19.535 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.535 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:19.535 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:19.535 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.535 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.536 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.536 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.536 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.536 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.536 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.536 "name": "raid_bdev1", 00:23:19.536 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:19.536 "strip_size_kb": 0, 00:23:19.536 "state": "online", 00:23:19.536 "raid_level": "raid1", 00:23:19.536 "superblock": true, 00:23:19.536 "num_base_bdevs": 2, 00:23:19.536 "num_base_bdevs_discovered": 1, 00:23:19.536 "num_base_bdevs_operational": 1, 00:23:19.536 "base_bdevs_list": [ 00:23:19.536 { 00:23:19.536 "name": null, 00:23:19.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.536 "is_configured": false, 00:23:19.536 "data_offset": 
256, 00:23:19.536 "data_size": 7936 00:23:19.536 }, 00:23:19.536 { 00:23:19.536 "name": "BaseBdev2", 00:23:19.536 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:19.536 "is_configured": true, 00:23:19.536 "data_offset": 256, 00:23:19.536 "data_size": 7936 00:23:19.536 } 00:23:19.536 ] 00:23:19.536 }' 00:23:19.536 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.536 12:04:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:20.104 12:04:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:20.363 [2024-07-12 12:04:10.438220] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:20.363 [2024-07-12 12:04:10.438342] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:20.363 [2024-07-12 12:04:10.438354] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:20.363 [2024-07-12 12:04:10.438373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:20.363 [2024-07-12 12:04:10.441396] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13bfa20 00:23:20.363 [2024-07-12 12:04:10.442389] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:20.364 12:04:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:21.300 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:21.301 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:21.301 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:21.301 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:21.301 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:21.301 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.301 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.559 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:21.560 "name": "raid_bdev1", 00:23:21.560 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:21.560 "strip_size_kb": 0, 00:23:21.560 "state": "online", 00:23:21.560 "raid_level": "raid1", 00:23:21.560 "superblock": true, 00:23:21.560 "num_base_bdevs": 2, 00:23:21.560 "num_base_bdevs_discovered": 2, 00:23:21.560 "num_base_bdevs_operational": 2, 00:23:21.560 "process": { 00:23:21.560 "type": 
"rebuild", 00:23:21.560 "target": "spare", 00:23:21.560 "progress": { 00:23:21.560 "blocks": 2816, 00:23:21.560 "percent": 35 00:23:21.560 } 00:23:21.560 }, 00:23:21.560 "base_bdevs_list": [ 00:23:21.560 { 00:23:21.560 "name": "spare", 00:23:21.560 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:21.560 "is_configured": true, 00:23:21.560 "data_offset": 256, 00:23:21.560 "data_size": 7936 00:23:21.560 }, 00:23:21.560 { 00:23:21.560 "name": "BaseBdev2", 00:23:21.560 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:21.560 "is_configured": true, 00:23:21.560 "data_offset": 256, 00:23:21.560 "data_size": 7936 00:23:21.560 } 00:23:21.560 ] 00:23:21.560 }' 00:23:21.560 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:21.560 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:21.560 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:21.560 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:21.560 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:21.819 [2024-07-12 12:04:11.874662] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:21.819 [2024-07-12 12:04:11.952917] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:21.819 [2024-07-12 12:04:11.952951] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:21.819 [2024-07-12 12:04:11.952960] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:21.819 [2024-07-12 12:04:11.952964] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.819 12:04:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:22.077 12:04:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:22.077 "name": "raid_bdev1", 00:23:22.077 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:22.077 "strip_size_kb": 0, 00:23:22.077 "state": "online", 00:23:22.077 "raid_level": "raid1", 00:23:22.077 "superblock": true, 00:23:22.077 
"num_base_bdevs": 2, 00:23:22.077 "num_base_bdevs_discovered": 1, 00:23:22.077 "num_base_bdevs_operational": 1, 00:23:22.077 "base_bdevs_list": [ 00:23:22.077 { 00:23:22.077 "name": null, 00:23:22.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.077 "is_configured": false, 00:23:22.077 "data_offset": 256, 00:23:22.077 "data_size": 7936 00:23:22.077 }, 00:23:22.077 { 00:23:22.077 "name": "BaseBdev2", 00:23:22.077 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:22.077 "is_configured": true, 00:23:22.077 "data_offset": 256, 00:23:22.077 "data_size": 7936 00:23:22.077 } 00:23:22.077 ] 00:23:22.077 }' 00:23:22.077 12:04:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:22.077 12:04:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:22.644 12:04:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:22.644 [2024-07-12 12:04:12.778510] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:22.644 [2024-07-12 12:04:12.778556] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:22.644 [2024-07-12 12:04:12.778569] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x155aa20 00:23:22.644 [2024-07-12 12:04:12.778577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:22.644 [2024-07-12 12:04:12.778728] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:22.645 [2024-07-12 12:04:12.778737] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:22.645 [2024-07-12 12:04:12.778778] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:22.645 [2024-07-12 12:04:12.778784] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:22.645 [2024-07-12 12:04:12.778790] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:22.645 [2024-07-12 12:04:12.778800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:22.645 [2024-07-12 12:04:12.781820] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15581b0 00:23:22.645 [2024-07-12 12:04:12.782774] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:22.645 spare 00:23:22.645 12:04:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:23.580 12:04:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:23.580 12:04:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:23.580 12:04:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:23.580 12:04:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:23.580 12:04:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:23.580 12:04:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.580 12:04:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.839 12:04:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:23.839 "name": "raid_bdev1", 00:23:23.839 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 
00:23:23.839 "strip_size_kb": 0, 00:23:23.839 "state": "online", 00:23:23.839 "raid_level": "raid1", 00:23:23.839 "superblock": true, 00:23:23.839 "num_base_bdevs": 2, 00:23:23.839 "num_base_bdevs_discovered": 2, 00:23:23.839 "num_base_bdevs_operational": 2, 00:23:23.839 "process": { 00:23:23.839 "type": "rebuild", 00:23:23.839 "target": "spare", 00:23:23.839 "progress": { 00:23:23.839 "blocks": 2816, 00:23:23.839 "percent": 35 00:23:23.839 } 00:23:23.839 }, 00:23:23.839 "base_bdevs_list": [ 00:23:23.839 { 00:23:23.839 "name": "spare", 00:23:23.839 "uuid": "ff90ff35-c598-5b17-8c88-b5a27d4673a6", 00:23:23.839 "is_configured": true, 00:23:23.839 "data_offset": 256, 00:23:23.839 "data_size": 7936 00:23:23.839 }, 00:23:23.839 { 00:23:23.839 "name": "BaseBdev2", 00:23:23.839 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:23.839 "is_configured": true, 00:23:23.839 "data_offset": 256, 00:23:23.839 "data_size": 7936 00:23:23.839 } 00:23:23.839 ] 00:23:23.839 }' 00:23:23.839 12:04:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:23.839 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:23.839 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:23.839 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:23.839 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:24.098 [2024-07-12 12:04:14.203871] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:24.098 [2024-07-12 12:04:14.293318] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:24.098 [2024-07-12 
12:04:14.293346] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:24.098 [2024-07-12 12:04:14.293354] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:24.098 [2024-07-12 12:04:14.293374] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.098 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.357 12:04:14 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.357 "name": "raid_bdev1", 00:23:24.357 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:24.357 "strip_size_kb": 0, 00:23:24.357 "state": "online", 00:23:24.357 "raid_level": "raid1", 00:23:24.357 "superblock": true, 00:23:24.357 "num_base_bdevs": 2, 00:23:24.357 "num_base_bdevs_discovered": 1, 00:23:24.357 "num_base_bdevs_operational": 1, 00:23:24.357 "base_bdevs_list": [ 00:23:24.357 { 00:23:24.357 "name": null, 00:23:24.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:24.358 "is_configured": false, 00:23:24.358 "data_offset": 256, 00:23:24.358 "data_size": 7936 00:23:24.358 }, 00:23:24.358 { 00:23:24.358 "name": "BaseBdev2", 00:23:24.358 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:24.358 "is_configured": true, 00:23:24.358 "data_offset": 256, 00:23:24.358 "data_size": 7936 00:23:24.358 } 00:23:24.358 ] 00:23:24.358 }' 00:23:24.358 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.358 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:24.926 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:24.926 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:24.926 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:24.926 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:24.926 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:24.926 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.926 12:04:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.926 12:04:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:24.926 "name": "raid_bdev1", 00:23:24.926 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:24.926 "strip_size_kb": 0, 00:23:24.926 "state": "online", 00:23:24.926 "raid_level": "raid1", 00:23:24.926 "superblock": true, 00:23:24.926 "num_base_bdevs": 2, 00:23:24.926 "num_base_bdevs_discovered": 1, 00:23:24.926 "num_base_bdevs_operational": 1, 00:23:24.926 "base_bdevs_list": [ 00:23:24.926 { 00:23:24.926 "name": null, 00:23:24.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:24.926 "is_configured": false, 00:23:24.926 "data_offset": 256, 00:23:24.926 "data_size": 7936 00:23:24.926 }, 00:23:24.926 { 00:23:24.926 "name": "BaseBdev2", 00:23:24.926 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:24.926 "is_configured": true, 00:23:24.926 "data_offset": 256, 00:23:24.926 "data_size": 7936 00:23:24.926 } 00:23:24.926 ] 00:23:24.926 }' 00:23:24.926 12:04:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:25.185 12:04:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:25.185 12:04:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:25.185 12:04:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:25.185 12:04:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:25.185 12:04:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:25.444 [2024-07-12 12:04:15.531973] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:25.444 [2024-07-12 12:04:15.532011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:25.444 [2024-07-12 12:04:15.532026] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c11a0 00:23:25.444 [2024-07-12 12:04:15.532032] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:25.444 [2024-07-12 12:04:15.532179] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:25.444 [2024-07-12 12:04:15.532188] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:25.444 [2024-07-12 12:04:15.532218] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:25.444 [2024-07-12 12:04:15.532236] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:25.444 [2024-07-12 12:04:15.532241] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:25.444 BaseBdev1 00:23:25.444 12:04:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:26.380 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:26.380 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.380 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.380 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:23:26.380 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.380 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:26.380 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.380 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.380 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.381 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.381 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.381 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.638 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.638 "name": "raid_bdev1", 00:23:26.638 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:26.638 "strip_size_kb": 0, 00:23:26.638 "state": "online", 00:23:26.638 "raid_level": "raid1", 00:23:26.638 "superblock": true, 00:23:26.638 "num_base_bdevs": 2, 00:23:26.638 "num_base_bdevs_discovered": 1, 00:23:26.638 "num_base_bdevs_operational": 1, 00:23:26.638 "base_bdevs_list": [ 00:23:26.638 { 00:23:26.638 "name": null, 00:23:26.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:26.638 "is_configured": false, 00:23:26.638 "data_offset": 256, 00:23:26.638 "data_size": 7936 00:23:26.638 }, 00:23:26.638 { 00:23:26.638 "name": "BaseBdev2", 00:23:26.638 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:26.638 "is_configured": true, 00:23:26.638 "data_offset": 256, 00:23:26.638 
"data_size": 7936 00:23:26.638 } 00:23:26.638 ] 00:23:26.638 }' 00:23:26.638 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.638 12:04:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:27.205 "name": "raid_bdev1", 00:23:27.205 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:27.205 "strip_size_kb": 0, 00:23:27.205 "state": "online", 00:23:27.205 "raid_level": "raid1", 00:23:27.205 "superblock": true, 00:23:27.205 "num_base_bdevs": 2, 00:23:27.205 "num_base_bdevs_discovered": 1, 00:23:27.205 "num_base_bdevs_operational": 1, 00:23:27.205 "base_bdevs_list": [ 00:23:27.205 { 00:23:27.205 "name": null, 00:23:27.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.205 "is_configured": false, 00:23:27.205 "data_offset": 256, 00:23:27.205 "data_size": 7936 00:23:27.205 }, 
00:23:27.205 { 00:23:27.205 "name": "BaseBdev2", 00:23:27.205 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:27.205 "is_configured": true, 00:23:27.205 "data_offset": 256, 00:23:27.205 "data_size": 7936 00:23:27.205 } 00:23:27.205 ] 00:23:27.205 }' 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:27.205 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:27.464 [2024-07-12 12:04:17.625399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:27.464 [2024-07-12 12:04:17.625499] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:27.464 [2024-07-12 12:04:17.625510] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:27.464 request: 00:23:27.464 { 00:23:27.464 "raid_bdev": "raid_bdev1", 00:23:27.464 "base_bdev": "BaseBdev1", 00:23:27.464 "method": "bdev_raid_add_base_bdev", 00:23:27.464 "req_id": 1 00:23:27.464 } 00:23:27.464 Got JSON-RPC error response 00:23:27.464 response: 00:23:27.464 { 00:23:27.464 "code": -22, 00:23:27.464 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:27.464 } 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:27.464 12:04:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:28.399 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:28.399 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:28.399 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:28.399 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.399 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:28.399 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:28.399 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.399 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.657 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.657 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.657 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.657 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.657 12:04:18 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:28.657 "name": "raid_bdev1", 00:23:28.657 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:28.657 "strip_size_kb": 0, 00:23:28.657 "state": "online", 00:23:28.657 "raid_level": "raid1", 00:23:28.657 "superblock": true, 00:23:28.657 "num_base_bdevs": 2, 00:23:28.657 "num_base_bdevs_discovered": 1, 00:23:28.657 "num_base_bdevs_operational": 1, 00:23:28.657 "base_bdevs_list": [ 00:23:28.657 { 00:23:28.657 "name": null, 00:23:28.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:28.657 "is_configured": false, 00:23:28.657 "data_offset": 256, 00:23:28.657 "data_size": 7936 00:23:28.657 }, 00:23:28.657 { 00:23:28.657 "name": "BaseBdev2", 00:23:28.657 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:28.657 "is_configured": true, 00:23:28.657 "data_offset": 256, 00:23:28.657 "data_size": 7936 00:23:28.657 } 00:23:28.657 ] 00:23:28.657 }' 00:23:28.657 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:28.657 12:04:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:29.222 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:29.222 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:29.222 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:29.222 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:29.222 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:29.222 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.222 12:04:19 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.222 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:29.222 "name": "raid_bdev1", 00:23:29.222 "uuid": "f021f4c0-4504-4fae-ae6f-977fe9a21f9d", 00:23:29.222 "strip_size_kb": 0, 00:23:29.222 "state": "online", 00:23:29.222 "raid_level": "raid1", 00:23:29.222 "superblock": true, 00:23:29.222 "num_base_bdevs": 2, 00:23:29.222 "num_base_bdevs_discovered": 1, 00:23:29.222 "num_base_bdevs_operational": 1, 00:23:29.222 "base_bdevs_list": [ 00:23:29.222 { 00:23:29.222 "name": null, 00:23:29.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:29.222 "is_configured": false, 00:23:29.222 "data_offset": 256, 00:23:29.222 "data_size": 7936 00:23:29.222 }, 00:23:29.222 { 00:23:29.222 "name": "BaseBdev2", 00:23:29.222 "uuid": "c904a626-68c2-5505-8810-9d04dc0d1a7d", 00:23:29.222 "is_configured": true, 00:23:29.222 "data_offset": 256, 00:23:29.222 "data_size": 7936 00:23:29.222 } 00:23:29.222 ] 00:23:29.222 }' 00:23:29.222 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 746244 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 746244 ']' 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@952 -- # kill -0 746244 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 746244 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 746244' 00:23:29.480 killing process with pid 746244 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 746244 00:23:29.480 Received shutdown signal, test time was about 60.000000 seconds 00:23:29.480 00:23:29.480 Latency(us) 00:23:29.480 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:29.480 =================================================================================================================== 00:23:29.480 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:29.480 [2024-07-12 12:04:19.536151] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:29.480 [2024-07-12 12:04:19.536222] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:29.480 [2024-07-12 12:04:19.536254] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:29.480 [2024-07-12 12:04:19.536260] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x155b7b0 name raid_bdev1, state offline 00:23:29.480 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@972 -- # wait 746244 00:23:29.480 [2024-07-12 12:04:19.559688] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:29.739 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:23:29.739 00:23:29.739 real 0m23.812s 00:23:29.739 user 0m37.036s 00:23:29.739 sys 0m2.489s 00:23:29.739 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:29.739 12:04:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:29.739 ************************************ 00:23:29.739 END TEST raid_rebuild_test_sb_md_interleaved 00:23:29.739 ************************************ 00:23:29.739 12:04:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:29.739 12:04:19 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:23:29.739 12:04:19 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:23:29.739 12:04:19 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 746244 ']' 00:23:29.739 12:04:19 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 746244 00:23:29.739 12:04:19 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:23:29.739 00:23:29.739 real 14m9.068s 00:23:29.739 user 23m59.498s 00:23:29.739 sys 2m8.295s 00:23:29.739 12:04:19 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:29.739 12:04:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:29.739 ************************************ 00:23:29.739 END TEST bdev_raid 00:23:29.739 ************************************ 00:23:29.739 12:04:19 -- common/autotest_common.sh@1142 -- # return 0 00:23:29.739 12:04:19 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:23:29.739 12:04:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:23:29.739 12:04:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:29.739 12:04:19 -- 
common/autotest_common.sh@10 -- # set +x 00:23:29.739 ************************************ 00:23:29.739 START TEST bdevperf_config 00:23:29.739 ************************************ 00:23:29.739 12:04:19 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:23:29.739 * Looking for test storage... 00:23:29.739 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:29.739 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@20 -- # 
cat 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:29.739 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:29.739 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:29.739 00:23:29.739 12:04:19 bdevperf_config -- 
bdevperf/common.sh@20 -- # cat 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:29.739 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:29.739 12:04:19 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:33.051 12:04:22 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-12 12:04:20.027110] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:33.051 [2024-07-12 12:04:20.027157] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid750654 ] 00:23:33.051 Using job config with 4 jobs 00:23:33.051 [2024-07-12 12:04:20.102356] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:33.051 [2024-07-12 12:04:20.188255] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:33.051 cpumask for '\''job0'\'' is too big 00:23:33.051 cpumask for '\''job1'\'' is too big 00:23:33.051 cpumask for '\''job2'\'' is too big 00:23:33.051 cpumask for '\''job3'\'' is too big 00:23:33.051 Running I/O for 2 seconds... 
00:23:33.051 00:23:33.051 Latency(us) 00:23:33.051 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:33.051 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.051 Malloc0 : 2.01 38607.22 37.70 0.00 0.00 6624.71 1256.11 10111.27 00:23:33.051 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.01 38583.37 37.68 0.00 0.00 6619.05 1154.68 8925.38 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.02 38618.08 37.71 0.00 0.00 6604.12 1146.88 8363.64 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.02 38594.31 37.69 0.00 0.00 6598.92 1146.88 8488.47 00:23:33.052 =================================================================================================================== 00:23:33.052 Total : 154402.97 150.78 0.00 0.00 6611.68 1146.88 10111.27' 00:23:33.052 12:04:22 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-12 12:04:20.027110] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:33.052 [2024-07-12 12:04:20.027157] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid750654 ] 00:23:33.052 Using job config with 4 jobs 00:23:33.052 [2024-07-12 12:04:20.102356] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:33.052 [2024-07-12 12:04:20.188255] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:33.052 cpumask for '\''job0'\'' is too big 00:23:33.052 cpumask for '\''job1'\'' is too big 00:23:33.052 cpumask for '\''job2'\'' is too big 00:23:33.052 cpumask for '\''job3'\'' is too big 00:23:33.052 Running I/O for 2 seconds... 
00:23:33.052 00:23:33.052 Latency(us) 00:23:33.052 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.01 38607.22 37.70 0.00 0.00 6624.71 1256.11 10111.27 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.01 38583.37 37.68 0.00 0.00 6619.05 1154.68 8925.38 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.02 38618.08 37.71 0.00 0.00 6604.12 1146.88 8363.64 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.02 38594.31 37.69 0.00 0.00 6598.92 1146.88 8488.47 00:23:33.052 =================================================================================================================== 00:23:33.052 Total : 154402.97 150.78 0.00 0.00 6611.68 1146.88 10111.27' 00:23:33.052 12:04:22 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 12:04:20.027110] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:33.052 [2024-07-12 12:04:20.027157] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid750654 ] 00:23:33.052 Using job config with 4 jobs 00:23:33.052 [2024-07-12 12:04:20.102356] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:33.052 [2024-07-12 12:04:20.188255] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:33.052 cpumask for '\''job0'\'' is too big 00:23:33.052 cpumask for '\''job1'\'' is too big 00:23:33.052 cpumask for '\''job2'\'' is too big 00:23:33.052 cpumask for '\''job3'\'' is too big 00:23:33.052 Running I/O for 2 seconds... 
00:23:33.052 00:23:33.052 Latency(us) 00:23:33.052 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.01 38607.22 37.70 0.00 0.00 6624.71 1256.11 10111.27 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.01 38583.37 37.68 0.00 0.00 6619.05 1154.68 8925.38 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.02 38618.08 37.71 0.00 0.00 6604.12 1146.88 8363.64 00:23:33.052 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:33.052 Malloc0 : 2.02 38594.31 37.69 0.00 0.00 6598.92 1146.88 8488.47 00:23:33.052 =================================================================================================================== 00:23:33.052 Total : 154402.97 150.78 0.00 0.00 6611.68 1146.88 10111.27' 00:23:33.052 12:04:22 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:23:33.052 12:04:22 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:23:33.052 12:04:22 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:23:33.052 12:04:22 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:33.052 [2024-07-12 12:04:22.595614] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:23:33.052 [2024-07-12 12:04:22.595657] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid751009 ] 00:23:33.052 [2024-07-12 12:04:22.670359] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:33.052 [2024-07-12 12:04:22.755568] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:33.052 cpumask for 'job0' is too big 00:23:33.052 cpumask for 'job1' is too big 00:23:33.052 cpumask for 'job2' is too big 00:23:33.052 cpumask for 'job3' is too big 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:23:34.953 Running I/O for 2 seconds... 00:23:34.953 00:23:34.953 Latency(us) 00:23:34.953 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:34.953 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:34.953 Malloc0 : 2.01 39111.92 38.20 0.00 0.00 6542.16 1170.29 10048.85 00:23:34.953 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:34.953 Malloc0 : 2.01 39089.86 38.17 0.00 0.00 6536.66 1154.68 8925.38 00:23:34.953 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:34.953 Malloc0 : 2.01 39131.49 38.21 0.00 0.00 6520.62 1146.88 7739.49 00:23:34.953 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:34.953 Malloc0 : 2.02 39109.48 38.19 0.00 0.00 6515.41 1146.88 7208.96 00:23:34.953 =================================================================================================================== 00:23:34.953 Total : 156442.75 152.78 0.00 0.00 6528.69 1146.88 10048.85' 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:34.953 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:34.953 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:23:34.953 12:04:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:23:34.954 12:04:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:34.954 12:04:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:23:34.954 12:04:25 bdevperf_config -- 
bdevperf/common.sh@18 -- # job='[job2]' 00:23:34.954 12:04:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:34.954 00:23:34.954 12:04:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:34.954 12:04:25 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-12 12:04:25.154705] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:37.487 [2024-07-12 12:04:25.154762] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid751358 ] 00:23:37.487 Using job config with 3 jobs 00:23:37.487 [2024-07-12 12:04:25.232099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:37.487 [2024-07-12 12:04:25.319271] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:37.487 cpumask for '\''job0'\'' is too big 00:23:37.487 cpumask for '\''job1'\'' is too big 00:23:37.487 cpumask for '\''job2'\'' is too big 00:23:37.487 Running I/O for 2 seconds... 
00:23:37.487 00:23:37.487 Latency(us) 00:23:37.487 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:37.487 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:37.487 Malloc0 : 2.01 51767.48 50.55 0.00 0.00 4937.02 1139.08 7146.54 00:23:37.487 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:37.487 Malloc0 : 2.01 51736.58 50.52 0.00 0.00 4933.14 1123.47 6023.07 00:23:37.487 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:37.487 Malloc0 : 2.01 51788.63 50.57 0.00 0.00 4921.66 604.65 5305.30 00:23:37.487 =================================================================================================================== 00:23:37.487 Total : 155292.70 151.65 0.00 0.00 4930.60 604.65 7146.54' 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-12 12:04:25.154705] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:37.487 [2024-07-12 12:04:25.154762] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid751358 ] 00:23:37.487 Using job config with 3 jobs 00:23:37.487 [2024-07-12 12:04:25.232099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:37.487 [2024-07-12 12:04:25.319271] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:37.487 cpumask for '\''job0'\'' is too big 00:23:37.487 cpumask for '\''job1'\'' is too big 00:23:37.487 cpumask for '\''job2'\'' is too big 00:23:37.487 Running I/O for 2 seconds... 
00:23:37.487 00:23:37.487 Latency(us) 00:23:37.487 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:37.487 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:37.487 Malloc0 : 2.01 51767.48 50.55 0.00 0.00 4937.02 1139.08 7146.54 00:23:37.487 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:37.487 Malloc0 : 2.01 51736.58 50.52 0.00 0.00 4933.14 1123.47 6023.07 00:23:37.487 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:37.487 Malloc0 : 2.01 51788.63 50.57 0.00 0.00 4921.66 604.65 5305.30 00:23:37.487 =================================================================================================================== 00:23:37.487 Total : 155292.70 151.65 0.00 0.00 4930.60 604.65 7146.54' 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 12:04:25.154705] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:37.487 [2024-07-12 12:04:25.154762] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid751358 ] 00:23:37.487 Using job config with 3 jobs 00:23:37.487 [2024-07-12 12:04:25.232099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:37.487 [2024-07-12 12:04:25.319271] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:37.487 cpumask for '\''job0'\'' is too big 00:23:37.487 cpumask for '\''job1'\'' is too big 00:23:37.487 cpumask for '\''job2'\'' is too big 00:23:37.487 Running I/O for 2 seconds... 
00:23:37.487 00:23:37.487 Latency(us) 00:23:37.487 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:37.487 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:37.487 Malloc0 : 2.01 51767.48 50.55 0.00 0.00 4937.02 1139.08 7146.54 00:23:37.487 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:37.487 Malloc0 : 2.01 51736.58 50.52 0.00 0.00 4933.14 1123.47 6023.07 00:23:37.487 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:37.487 Malloc0 : 2.01 51788.63 50.57 0.00 0.00 4921.66 604.65 5305.30 00:23:37.487 =================================================================================================================== 00:23:37.487 Total : 155292.70 151.65 0.00 0.00 4930.60 604.65 7146.54' 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:23:37.487 12:04:27 
bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:37.487 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:37.487 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:37.487 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 
00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:37.487 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:37.487 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:37.487 12:04:27 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:40.022 12:04:30 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-12 12:04:27.738712] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:23:40.022 [2024-07-12 12:04:27.738757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid751822 ] 00:23:40.022 Using job config with 4 jobs 00:23:40.022 [2024-07-12 12:04:27.810622] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:40.022 [2024-07-12 12:04:27.893745] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:40.022 cpumask for '\''job0'\'' is too big 00:23:40.022 cpumask for '\''job1'\'' is too big 00:23:40.022 cpumask for '\''job2'\'' is too big 00:23:40.022 cpumask for '\''job3'\'' is too big 00:23:40.022 Running I/O for 2 seconds... 00:23:40.022 00:23:40.022 Latency(us) 00:23:40.022 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:40.022 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc0 : 2.02 18777.03 18.34 0.00 0.00 13623.69 2465.40 21720.50 00:23:40.022 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc1 : 2.02 18765.35 18.33 0.00 0.00 13623.66 2886.70 21720.50 00:23:40.022 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc0 : 2.02 18754.34 18.31 0.00 0.00 13600.62 2559.02 19099.06 00:23:40.022 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc1 : 2.03 18789.80 18.35 0.00 0.00 13567.39 3136.37 18974.23 00:23:40.022 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc0 : 2.03 18779.07 18.34 0.00 0.00 13542.66 2559.02 16352.79 00:23:40.022 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc1 : 2.03 18768.28 18.33 0.00 0.00 13539.96 3105.16 16227.96 00:23:40.022 Job: Malloc0 (Core Mask 
0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc0 : 2.03 18757.77 18.32 0.00 0.00 13514.92 2481.01 14792.41 00:23:40.022 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc1 : 2.03 18747.08 18.31 0.00 0.00 13515.20 3042.74 14917.24 00:23:40.022 =================================================================================================================== 00:23:40.022 Total : 150138.73 146.62 0.00 0.00 13565.89 2465.40 21720.50' 00:23:40.022 12:04:30 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-12 12:04:27.738712] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:40.022 [2024-07-12 12:04:27.738757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid751822 ] 00:23:40.022 Using job config with 4 jobs 00:23:40.022 [2024-07-12 12:04:27.810622] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:40.022 [2024-07-12 12:04:27.893745] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:40.022 cpumask for '\''job0'\'' is too big 00:23:40.022 cpumask for '\''job1'\'' is too big 00:23:40.022 cpumask for '\''job2'\'' is too big 00:23:40.022 cpumask for '\''job3'\'' is too big 00:23:40.022 Running I/O for 2 seconds... 
00:23:40.022 00:23:40.022 Latency(us) 00:23:40.022 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:40.022 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc0 : 2.02 18777.03 18.34 0.00 0.00 13623.69 2465.40 21720.50 00:23:40.022 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.022 Malloc1 : 2.02 18765.35 18.33 0.00 0.00 13623.66 2886.70 21720.50 00:23:40.023 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc0 : 2.02 18754.34 18.31 0.00 0.00 13600.62 2559.02 19099.06 00:23:40.023 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc1 : 2.03 18789.80 18.35 0.00 0.00 13567.39 3136.37 18974.23 00:23:40.023 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc0 : 2.03 18779.07 18.34 0.00 0.00 13542.66 2559.02 16352.79 00:23:40.023 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc1 : 2.03 18768.28 18.33 0.00 0.00 13539.96 3105.16 16227.96 00:23:40.023 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc0 : 2.03 18757.77 18.32 0.00 0.00 13514.92 2481.01 14792.41 00:23:40.023 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc1 : 2.03 18747.08 18.31 0.00 0.00 13515.20 3042.74 14917.24 00:23:40.023 =================================================================================================================== 00:23:40.023 Total : 150138.73 146.62 0.00 0.00 13565.89 2465.40 21720.50' 00:23:40.023 12:04:30 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 12:04:27.738712] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:23:40.023 [2024-07-12 12:04:27.738757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid751822 ] 00:23:40.023 Using job config with 4 jobs 00:23:40.023 [2024-07-12 12:04:27.810622] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:40.023 [2024-07-12 12:04:27.893745] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:40.023 cpumask for '\''job0'\'' is too big 00:23:40.023 cpumask for '\''job1'\'' is too big 00:23:40.023 cpumask for '\''job2'\'' is too big 00:23:40.023 cpumask for '\''job3'\'' is too big 00:23:40.023 Running I/O for 2 seconds... 00:23:40.023 00:23:40.023 Latency(us) 00:23:40.023 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:40.023 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc0 : 2.02 18777.03 18.34 0.00 0.00 13623.69 2465.40 21720.50 00:23:40.023 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc1 : 2.02 18765.35 18.33 0.00 0.00 13623.66 2886.70 21720.50 00:23:40.023 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc0 : 2.02 18754.34 18.31 0.00 0.00 13600.62 2559.02 19099.06 00:23:40.023 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc1 : 2.03 18789.80 18.35 0.00 0.00 13567.39 3136.37 18974.23 00:23:40.023 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc0 : 2.03 18779.07 18.34 0.00 0.00 13542.66 2559.02 16352.79 00:23:40.023 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc1 : 2.03 18768.28 18.33 0.00 0.00 13539.96 3105.16 16227.96 00:23:40.023 Job: Malloc0 (Core Mask 
0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc0 : 2.03 18757.77 18.32 0.00 0.00 13514.92 2481.01 14792.41 00:23:40.023 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:40.023 Malloc1 : 2.03 18747.08 18.31 0.00 0.00 13515.20 3042.74 14917.24 00:23:40.023 =================================================================================================================== 00:23:40.023 Total : 150138.73 146.62 0.00 0.00 13565.89 2465.40 21720.50' 00:23:40.023 12:04:30 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:23:40.023 12:04:30 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:23:40.023 12:04:30 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:23:40.023 12:04:30 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:23:40.023 12:04:30 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:40.284 12:04:30 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:23:40.284 00:23:40.284 real 0m10.410s 00:23:40.284 user 0m9.465s 00:23:40.284 sys 0m0.802s 00:23:40.284 12:04:30 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:40.284 12:04:30 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:23:40.284 ************************************ 00:23:40.284 END TEST bdevperf_config 00:23:40.284 ************************************ 00:23:40.284 12:04:30 -- common/autotest_common.sh@1142 -- # return 0 00:23:40.284 12:04:30 -- spdk/autotest.sh@192 -- # uname -s 00:23:40.284 12:04:30 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:23:40.284 12:04:30 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:23:40.284 12:04:30 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 
00:23:40.284 12:04:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:40.284 12:04:30 -- common/autotest_common.sh@10 -- # set +x 00:23:40.284 ************************************ 00:23:40.284 START TEST reactor_set_interrupt 00:23:40.284 ************************************ 00:23:40.284 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:23:40.284 * Looking for test storage... 00:23:40.284 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:40.284 12:04:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:23:40.284 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:23:40.284 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:40.284 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:40.284 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:23:40.284 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:40.284 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:23:40.284 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:23:40.284 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:23:40.284 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:23:40.284 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:23:40.284 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:23:40.284 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:23:40.284 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:23:40.284 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:23:40.284 12:04:30 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:23:40.284 12:04:30 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:23:40.284 12:04:30 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:23:40.285 12:04:30 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:23:40.285 12:04:30 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:23:40.285 12:04:30 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:23:40.285 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:23:40.285 #define SPDK_CONFIG_H 00:23:40.285 #define SPDK_CONFIG_APPS 1 00:23:40.285 #define SPDK_CONFIG_ARCH native 00:23:40.285 #undef SPDK_CONFIG_ASAN 00:23:40.285 #undef SPDK_CONFIG_AVAHI 00:23:40.285 #undef SPDK_CONFIG_CET 00:23:40.285 #define SPDK_CONFIG_COVERAGE 1 00:23:40.285 #define SPDK_CONFIG_CROSS_PREFIX 
00:23:40.285 #define SPDK_CONFIG_CRYPTO 1 00:23:40.285 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:23:40.285 #undef SPDK_CONFIG_CUSTOMOCF 00:23:40.285 #undef SPDK_CONFIG_DAOS 00:23:40.285 #define SPDK_CONFIG_DAOS_DIR 00:23:40.285 #define SPDK_CONFIG_DEBUG 1 00:23:40.285 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:23:40.285 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:40.285 #define SPDK_CONFIG_DPDK_INC_DIR 00:23:40.285 #define SPDK_CONFIG_DPDK_LIB_DIR 00:23:40.285 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:23:40.285 #undef SPDK_CONFIG_DPDK_UADK 00:23:40.285 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:40.285 #define SPDK_CONFIG_EXAMPLES 1 00:23:40.285 #undef SPDK_CONFIG_FC 00:23:40.285 #define SPDK_CONFIG_FC_PATH 00:23:40.285 #define SPDK_CONFIG_FIO_PLUGIN 1 00:23:40.285 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:23:40.285 #undef SPDK_CONFIG_FUSE 00:23:40.285 #undef SPDK_CONFIG_FUZZER 00:23:40.285 #define SPDK_CONFIG_FUZZER_LIB 00:23:40.285 #undef SPDK_CONFIG_GOLANG 00:23:40.285 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:23:40.285 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:23:40.285 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:23:40.285 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:23:40.285 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:23:40.285 #undef SPDK_CONFIG_HAVE_LIBBSD 00:23:40.285 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:23:40.285 #define SPDK_CONFIG_IDXD 1 00:23:40.285 #define SPDK_CONFIG_IDXD_KERNEL 1 00:23:40.285 #define SPDK_CONFIG_IPSEC_MB 1 00:23:40.285 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:40.285 #define SPDK_CONFIG_ISAL 1 00:23:40.285 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:23:40.285 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:23:40.285 #define SPDK_CONFIG_LIBDIR 00:23:40.285 #undef SPDK_CONFIG_LTO 00:23:40.285 #define SPDK_CONFIG_MAX_LCORES 00:23:40.285 #define SPDK_CONFIG_NVME_CUSE 1 00:23:40.285 #undef 
SPDK_CONFIG_OCF 00:23:40.285 #define SPDK_CONFIG_OCF_PATH 00:23:40.285 #define SPDK_CONFIG_OPENSSL_PATH 00:23:40.285 #undef SPDK_CONFIG_PGO_CAPTURE 00:23:40.285 #define SPDK_CONFIG_PGO_DIR 00:23:40.285 #undef SPDK_CONFIG_PGO_USE 00:23:40.285 #define SPDK_CONFIG_PREFIX /usr/local 00:23:40.285 #undef SPDK_CONFIG_RAID5F 00:23:40.285 #undef SPDK_CONFIG_RBD 00:23:40.285 #define SPDK_CONFIG_RDMA 1 00:23:40.285 #define SPDK_CONFIG_RDMA_PROV verbs 00:23:40.285 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:23:40.285 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:23:40.285 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:23:40.285 #define SPDK_CONFIG_SHARED 1 00:23:40.285 #undef SPDK_CONFIG_SMA 00:23:40.285 #define SPDK_CONFIG_TESTS 1 00:23:40.285 #undef SPDK_CONFIG_TSAN 00:23:40.285 #define SPDK_CONFIG_UBLK 1 00:23:40.285 #define SPDK_CONFIG_UBSAN 1 00:23:40.285 #undef SPDK_CONFIG_UNIT_TESTS 00:23:40.285 #undef SPDK_CONFIG_URING 00:23:40.285 #define SPDK_CONFIG_URING_PATH 00:23:40.285 #undef SPDK_CONFIG_URING_ZNS 00:23:40.285 #undef SPDK_CONFIG_USDT 00:23:40.285 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:23:40.285 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:23:40.285 #undef SPDK_CONFIG_VFIO_USER 00:23:40.285 #define SPDK_CONFIG_VFIO_USER_DIR 00:23:40.285 #define SPDK_CONFIG_VHOST 1 00:23:40.285 #define SPDK_CONFIG_VIRTIO 1 00:23:40.285 #undef SPDK_CONFIG_VTUNE 00:23:40.285 #define SPDK_CONFIG_VTUNE_DIR 00:23:40.285 #define SPDK_CONFIG_WERROR 1 00:23:40.285 #define SPDK_CONFIG_WPDK_DIR 00:23:40.285 #undef SPDK_CONFIG_XNVME 00:23:40.285 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:23:40.285 12:04:30 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:23:40.285 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:23:40.285 12:04:30 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:23:40.285 12:04:30 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:23:40.285 12:04:30 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:23:40.285 12:04:30 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:40.285 12:04:30 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:40.286 12:04:30 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:40.286 12:04:30 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:23:40.286 12:04:30 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:23:40.286 12:04:30 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:23:40.286 12:04:30 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:23:40.286 12:04:30 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:23:40.286 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:23:40.287 
12:04:30 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:40.287 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:40.548 12:04:30 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:23:40.548 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:40.549 12:04:30 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j96 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 752307 ]] 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 752307 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.OsXRAc 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.OsXRAc/tests/interrupt /tmp/spdk.OsXRAc 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size 
use avail _ mount 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=895512576 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4388917248 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=90267181056 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=95562739712 00:23:40.549 12:04:30 reactor_set_interrupt 
-- common/autotest_common.sh@363 -- # uses["$mount"]=5295558656 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47777992704 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781367808 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=19102969856 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=19112550400 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9580544 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47780941824 00:23:40.549 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781371904 00:23:40.550 12:04:30 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=430080 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9556267008 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9556271104 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:23:40.550 * Looking for test storage... 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=90267181056 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:23:40.550 12:04:30 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7510151168 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:40.550 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:23:40.550 12:04:30 reactor_set_interrupt 
-- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:40.550 12:04:30 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=752348 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:23:40.550 12:04:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 752348 /var/tmp/spdk.sock 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 752348 ']' 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:40.550 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:40.550 12:04:30 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:23:40.550 [2024-07-12 12:04:30.625805] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:40.550 [2024-07-12 12:04:30.625844] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid752348 ] 00:23:40.550 [2024-07-12 12:04:30.688780] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:40.550 [2024-07-12 12:04:30.760650] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:23:40.550 [2024-07-12 12:04:30.760746] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:23:40.550 [2024-07-12 12:04:30.760748] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:40.810 [2024-07-12 12:04:30.824686] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:23:41.386 12:04:31 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:41.386 12:04:31 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:23:41.386 12:04:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:23:41.386 12:04:31 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:41.386 Malloc0 00:23:41.386 Malloc1 00:23:41.386 Malloc2 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:23:41.645 5000+0 records in 00:23:41.645 5000+0 records out 00:23:41.645 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0181997 s, 563 MB/s 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:23:41.645 AIO0 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 752348 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 752348 without_thd 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=752348 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:41.645 12:04:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:41.905 12:04:32 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:23:41.905 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:23:41.905 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:23:41.905 12:04:32 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:23:41.905 12:04:32 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:41.905 12:04:32 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:23:41.905 12:04:32 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:41.905 12:04:32 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:41.905 12:04:32 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:23:42.164 spdk_thread ids are 1 on reactor0. 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 752348 0 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 752348 0 idle 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=752348 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 752348 -w 256 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 752348 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.25 reactor_0' 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 752348 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.25 reactor_0 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:42.164 12:04:32 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 752348 1 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 752348 1 idle 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=752348 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 752348 -w 256 00:23:42.164 12:04:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 752351 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1' 
00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 752351 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 752348 2 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 752348 2 idle 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=752348 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:42.423 12:04:32 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 752348 -w 256 00:23:42.423 12:04:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 752352 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2' 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 752352 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:23:42.683 [2024-07-12 12:04:32.885434] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:23:42.683 12:04:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:23:42.942 [2024-07-12 12:04:33.053141] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:23:42.942 [2024-07-12 12:04:33.056562] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:42.942 12:04:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:23:43.201 [2024-07-12 12:04:33.217125] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:23:43.201 [2024-07-12 12:04:33.217218] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 752348 0 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 752348 0 busy 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=752348 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:43.201 12:04:33 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 752348 -w 256 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 752348 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.61 reactor_0' 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 752348 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.61 reactor_0 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 752348 2 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 752348 2 busy 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=752348 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:43.201 12:04:33 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 752348 -w 256 00:23:43.201 12:04:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 752352 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.35 reactor_2' 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 752352 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.35 reactor_2 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:43.460 12:04:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:23:43.719 [2024-07-12 12:04:33.737123] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:23:43.719 [2024-07-12 12:04:33.737220] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 752348 2 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 752348 2 idle 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=752348 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 752348 -w 256 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 752352 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.51 reactor_2' 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 752352 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.51 reactor_2 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:43.719 12:04:33 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:43.719 12:04:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:23:43.977 [2024-07-12 12:04:34.077130] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:23:43.977 [2024-07-12 12:04:34.077238] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:43.977 12:04:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:23:43.977 12:04:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:23:43.977 12:04:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:23:44.235 [2024-07-12 12:04:34.249440] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 752348 0 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 752348 0 idle 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=752348 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 752348 -w 256 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 752348 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.29 reactor_0' 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 752348 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.29 reactor_0 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 
00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:23:44.235 12:04:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 752348 00:23:44.235 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 752348 ']' 00:23:44.235 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 752348 00:23:44.235 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:23:44.235 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:44.235 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 752348 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 752348' 00:23:44.494 killing process with pid 752348 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 752348 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 752348 00:23:44.494 12:04:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:23:44.494 12:04:34 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:23:44.494 12:04:34 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:23:44.494 12:04:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:44.494 12:04:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:23:44.494 12:04:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=753101 00:23:44.494 12:04:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:44.494 12:04:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 753101 /var/tmp/spdk.sock 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 753101 ']' 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:44.494 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:44.494 12:04:34 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:23:44.494 12:04:34 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:23:44.494 [2024-07-12 12:04:34.730287] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:23:44.494 [2024-07-12 12:04:34.730327] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid753101 ] 00:23:44.753 [2024-07-12 12:04:34.793642] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:44.753 [2024-07-12 12:04:34.873344] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:23:44.753 [2024-07-12 12:04:34.873445] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:23:44.753 [2024-07-12 12:04:34.873446] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:44.753 [2024-07-12 12:04:34.936691] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:23:45.319 12:04:35 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:45.319 12:04:35 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:23:45.319 12:04:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:23:45.319 12:04:35 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:45.577 Malloc0 00:23:45.577 Malloc1 00:23:45.577 Malloc2 00:23:45.577 12:04:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:23:45.577 12:04:35 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:23:45.577 12:04:35 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:23:45.577 12:04:35 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:23:45.577 5000+0 records in 00:23:45.577 5000+0 records out 00:23:45.577 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0165229 s, 620 MB/s 00:23:45.577 12:04:35 
reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:23:45.836 AIO0 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 753101 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 753101 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=753101 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:45.836 12:04:35 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:46.095 12:04:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:23:46.095 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:23:46.095 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # 
reactor_get_thread_ids 0x4 00:23:46.095 12:04:36 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:23:46.095 12:04:36 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:23:46.096 spdk_thread ids are 1 on reactor0. 
00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 753101 0 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 753101 0 idle 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=753101 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 753101 -w 256 00:23:46.096 12:04:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 753101 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.27 reactor_0' 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 753101 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.27 reactor_0 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = 
\b\u\s\y ]] 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 753101 1 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 753101 1 idle 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=753101 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 753101 -w 256 00:23:46.355 12:04:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 753104 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1' 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 753104 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- 
# sed -e 's/^\s*//g' 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 753101 2 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 753101 2 idle 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=753101 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 753101 -w 256 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 753105 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2' 00:23:46.615 12:04:36 
reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 753105 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:23:46.615 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:23:46.874 [2024-07-12 12:04:36.945817] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:23:46.874 [2024-07-12 12:04:36.945913] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:23:46.874 [2024-07-12 12:04:36.946055] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:46.874 12:04:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:23:46.874 [2024-07-12 12:04:37.114188] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:23:46.875 [2024-07-12 12:04:37.114356] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:47.133 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:47.133 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 753101 0 00:23:47.133 12:04:37 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 753101 0 busy 00:23:47.133 12:04:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=753101 00:23:47.133 12:04:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:47.133 12:04:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:47.133 12:04:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:47.133 12:04:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 753101 -w 256 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 753101 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.62 reactor_0' 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 753101 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.62 reactor_0 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:47.134 12:04:37 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 753101 2 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 753101 2 busy 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=753101 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 753101 -w 256 00:23:47.134 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 753105 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.35 reactor_2' 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 753105 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.35 reactor_2 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:47.393 12:04:37 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:47.393 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:23:47.393 [2024-07-12 12:04:37.627603] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:23:47.393 [2024-07-12 12:04:37.627695] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:47.652 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:23:47.652 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 753101 2 00:23:47.652 12:04:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 753101 2 idle 00:23:47.652 12:04:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=753101 00:23:47.652 12:04:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:47.652 12:04:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:47.652 12:04:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:47.652 12:04:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:47.652 12:04:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 753101 -w 256 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 753105 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.51 reactor_2' 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 753105 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.51 reactor_2 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:47.653 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:23:47.912 [2024-07-12 12:04:37.972485] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:23:47.912 [2024-07-12 12:04:37.972722] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:23:47.912 [2024-07-12 12:04:37.972739] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 753101 0 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 753101 0 idle 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=753101 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 753101 -w 256 00:23:47.912 12:04:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 753101 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.30 reactor_0' 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 753101 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.30 reactor_0 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:48.172 12:04:38 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 753101 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 753101 ']' 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 753101 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 753101 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 753101' 00:23:48.172 killing process with pid 753101 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 753101 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 753101 00:23:48.172 12:04:38 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:23:48.172 12:04:38 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:23:48.172 00:23:48.172 real 0m8.068s 00:23:48.172 user 0m7.225s 00:23:48.172 sys 0m1.457s 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:48.172 12:04:38 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:23:48.172 ************************************ 00:23:48.172 END TEST reactor_set_interrupt 00:23:48.172 ************************************ 00:23:48.433 12:04:38 -- common/autotest_common.sh@1142 -- # return 0 00:23:48.433 12:04:38 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:23:48.433 12:04:38 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:23:48.433 12:04:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:48.433 12:04:38 -- common/autotest_common.sh@10 -- # set +x 00:23:48.433 ************************************ 00:23:48.433 START TEST reap_unregistered_poller 00:23:48.433 ************************************ 00:23:48.433 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:23:48.433 * Looking for test storage... 
00:23:48.433 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:48.433 12:04:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:23:48.433 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:23:48.433 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:48.433 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:48.433 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:23:48.433 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:48.433 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:23:48.433 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:23:48.433 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:23:48.433 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:23:48.433 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:23:48.433 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:23:48.433 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:23:48.433 12:04:38 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:23:48.433 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:23:48.433 12:04:38 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:23:48.433 12:04:38 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:23:48.434 
12:04:38 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:23:48.434 12:04:38 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:23:48.434 12:04:38 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:23:48.434 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:23:48.434 12:04:38 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:23:48.434 #define SPDK_CONFIG_H 00:23:48.434 #define SPDK_CONFIG_APPS 1 00:23:48.434 #define SPDK_CONFIG_ARCH native 00:23:48.434 #undef SPDK_CONFIG_ASAN 00:23:48.434 #undef SPDK_CONFIG_AVAHI 00:23:48.434 #undef SPDK_CONFIG_CET 00:23:48.434 #define SPDK_CONFIG_COVERAGE 1 00:23:48.434 #define SPDK_CONFIG_CROSS_PREFIX 00:23:48.434 #define SPDK_CONFIG_CRYPTO 1 00:23:48.434 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:23:48.434 #undef SPDK_CONFIG_CUSTOMOCF 00:23:48.434 #undef SPDK_CONFIG_DAOS 00:23:48.434 #define SPDK_CONFIG_DAOS_DIR 00:23:48.434 #define SPDK_CONFIG_DEBUG 1 00:23:48.434 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:23:48.434 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:48.434 #define SPDK_CONFIG_DPDK_INC_DIR 00:23:48.434 #define SPDK_CONFIG_DPDK_LIB_DIR 00:23:48.434 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:23:48.434 #undef SPDK_CONFIG_DPDK_UADK 00:23:48.434 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:48.434 #define SPDK_CONFIG_EXAMPLES 1 00:23:48.434 #undef SPDK_CONFIG_FC 00:23:48.434 #define SPDK_CONFIG_FC_PATH 00:23:48.434 #define SPDK_CONFIG_FIO_PLUGIN 1 00:23:48.434 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:23:48.434 #undef SPDK_CONFIG_FUSE 00:23:48.434 #undef SPDK_CONFIG_FUZZER 00:23:48.434 #define SPDK_CONFIG_FUZZER_LIB 00:23:48.434 #undef SPDK_CONFIG_GOLANG 00:23:48.434 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:23:48.434 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:23:48.434 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:23:48.434 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:23:48.434 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:23:48.434 #undef SPDK_CONFIG_HAVE_LIBBSD 00:23:48.434 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:23:48.434 #define SPDK_CONFIG_IDXD 1 00:23:48.434 #define SPDK_CONFIG_IDXD_KERNEL 1 00:23:48.434 #define SPDK_CONFIG_IPSEC_MB 1 00:23:48.435 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:48.435 #define SPDK_CONFIG_ISAL 1 00:23:48.435 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:23:48.435 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:23:48.435 #define SPDK_CONFIG_LIBDIR 00:23:48.435 #undef SPDK_CONFIG_LTO 00:23:48.435 #define SPDK_CONFIG_MAX_LCORES 00:23:48.435 #define SPDK_CONFIG_NVME_CUSE 1 00:23:48.435 #undef SPDK_CONFIG_OCF 00:23:48.435 #define SPDK_CONFIG_OCF_PATH 00:23:48.435 #define SPDK_CONFIG_OPENSSL_PATH 00:23:48.435 #undef SPDK_CONFIG_PGO_CAPTURE 00:23:48.435 #define SPDK_CONFIG_PGO_DIR 00:23:48.435 #undef SPDK_CONFIG_PGO_USE 00:23:48.435 #define SPDK_CONFIG_PREFIX /usr/local 00:23:48.435 #undef SPDK_CONFIG_RAID5F 00:23:48.435 #undef SPDK_CONFIG_RBD 00:23:48.435 #define SPDK_CONFIG_RDMA 1 00:23:48.435 #define SPDK_CONFIG_RDMA_PROV verbs 00:23:48.435 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:23:48.435 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:23:48.435 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:23:48.435 #define 
SPDK_CONFIG_SHARED 1 00:23:48.435 #undef SPDK_CONFIG_SMA 00:23:48.435 #define SPDK_CONFIG_TESTS 1 00:23:48.435 #undef SPDK_CONFIG_TSAN 00:23:48.435 #define SPDK_CONFIG_UBLK 1 00:23:48.435 #define SPDK_CONFIG_UBSAN 1 00:23:48.435 #undef SPDK_CONFIG_UNIT_TESTS 00:23:48.435 #undef SPDK_CONFIG_URING 00:23:48.435 #define SPDK_CONFIG_URING_PATH 00:23:48.435 #undef SPDK_CONFIG_URING_ZNS 00:23:48.435 #undef SPDK_CONFIG_USDT 00:23:48.435 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:23:48.435 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:23:48.435 #undef SPDK_CONFIG_VFIO_USER 00:23:48.435 #define SPDK_CONFIG_VFIO_USER_DIR 00:23:48.435 #define SPDK_CONFIG_VHOST 1 00:23:48.435 #define SPDK_CONFIG_VIRTIO 1 00:23:48.435 #undef SPDK_CONFIG_VTUNE 00:23:48.435 #define SPDK_CONFIG_VTUNE_DIR 00:23:48.435 #define SPDK_CONFIG_WERROR 1 00:23:48.435 #define SPDK_CONFIG_WPDK_DIR 00:23:48.435 #undef SPDK_CONFIG_XNVME 00:23:48.435 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:23:48.435 12:04:38 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:23:48.435 12:04:38 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:23:48.435 12:04:38 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:23:48.435 12:04:38 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:23:48.435 12:04:38 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:48.435 12:04:38 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:48.435 12:04:38 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:48.435 12:04:38 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:23:48.435 12:04:38 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:23:48.435 12:04:38 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:23:48.435 12:04:38 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:23:48.435 12:04:38 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:23:48.435 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:23:48.436 12:04:38 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:23:48.436 12:04:38 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:23:48.436 12:04:38 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:23:48.436 12:04:38 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:23:48.436 12:04:38 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:48.436 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:23:48.437 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:23:48.437 12:04:38 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j96 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 753886 ]] 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 753886 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.G8hSbs 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.G8hSbs/tests/interrupt /tmp/spdk.G8hSbs 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:23:48.697 12:04:38 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=895512576 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4388917248 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:23:48.697 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=90267025408 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=95562739712 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5295714304 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47777992704 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781367808 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=19102969856 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=19112550400 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9580544 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47780941824 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781371904 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=430080 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:23:48.698 12:04:38 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9556267008 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9556271104 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:23:48.698 * Looking for test storage... 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=90267025408 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:23:48.698 12:04:38 
reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7510306816 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:48.698 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 
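The storage check traced above (`new_size=7510306816`, then `(( new_size * 100 / sizes[/] > 95 ))`) can be reproduced as a minimal standalone sketch. The numbers below are the ones from the trace itself; the echo messages are mine, not SPDK's:

```shell
#!/usr/bin/env bash
# Sketch of set_test_storage's acceptance check as it appears in the trace:
# a candidate mount is accepted if (bytes already used + requested test
# budget) stays at or under 95% of the filesystem size.
used=5295714304          # uses["/"] for the spdk_root overlay, per the trace
requested=2214592512     # requested_size after the tests/ slack was added
size=95562739712         # sizes["/"], per the trace

new_size=$(( used + requested ))   # 7510306816, matching the log
if (( new_size * 100 / size > 95 )); then
    echo "insufficient: $new_size would exceed 95% of $size"
else
    echo "ok: new_size=$new_size"
fi
```

With the traced values the ratio is about 7%, so the check passes and the run proceeds to export SPDK_TEST_STORAGE, as seen in the following lines.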
00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=753928 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:23:48.698 12:04:38 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 753928 /var/tmp/spdk.sock 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 753928 ']' 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:48.698 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:48.698 12:04:38 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:48.698 [2024-07-12 12:04:38.763386] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:23:48.698 [2024-07-12 12:04:38.763428] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid753928 ] 00:23:48.698 [2024-07-12 12:04:38.827278] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:48.698 [2024-07-12 12:04:38.900623] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:23:48.698 [2024-07-12 12:04:38.900725] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:23:48.698 [2024-07-12 12:04:38.900727] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:48.958 [2024-07-12 12:04:38.964331] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:23:49.525 12:04:39 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:49.525 12:04:39 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:23:49.525 12:04:39 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:23:49.525 12:04:39 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:49.525 12:04:39 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:23:49.525 "name": "app_thread", 00:23:49.525 "id": 1, 00:23:49.525 "active_pollers": [], 00:23:49.525 "timed_pollers": [ 00:23:49.525 { 00:23:49.525 "name": "rpc_subsystem_poll_servers", 00:23:49.525 "id": 1, 00:23:49.525 "state": "waiting", 00:23:49.525 "run_count": 0, 00:23:49.525 "busy_count": 0, 00:23:49.525 "period_ticks": 8400000 00:23:49.525 } 00:23:49.525 ], 00:23:49.525 "paused_pollers": [] 00:23:49.525 }' 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:23:49.525 
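An aside on the extraction step above: the test pipes `rpc.py thread_get_pollers` through `jq -r '.threads[0]'`, then pulls poller names out with `.active_pollers[].name` and `.timed_pollers[].name`. A minimal standalone sketch of that same filtering in Python, using the exact payload captured in the log (this is an illustration only, not part of the test script):

```python
import json

# Payload as captured from `rpc_cmd thread_get_pollers | jq -r '.threads[0]'` in the log.
app_thread = json.loads("""
{
  "name": "app_thread",
  "id": 1,
  "active_pollers": [],
  "timed_pollers": [
    {
      "name": "rpc_subsystem_poll_servers",
      "id": 1,
      "state": "waiting",
      "run_count": 0,
      "busy_count": 0,
      "period_ticks": 8400000
    }
  ],
  "paused_pollers": []
}
""")

# Equivalent of jq -r '.active_pollers[].name' — empty here, so native_pollers stays blank.
native = [p["name"] for p in app_thread["active_pollers"]]
# Equivalent of jq -r '.timed_pollers[].name' — yields the one registered timed poller.
timed = [p["name"] for p in app_thread["timed_pollers"]]
print(native, timed)
```

This matches the log: `native_pollers` ends up empty and the only timed poller is `rpc_subsystem_poll_servers`, which is what the later `[[ ... ]]` comparison in `reap_unregistered_poller.sh@44` checks against.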
12:04:39 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:23:49.525 5000+0 records in 00:23:49.525 5000+0 records out 00:23:49.525 10240000 bytes (10 MB, 9.8 MiB) copied, 0.00700676 s, 1.5 GB/s 00:23:49.525 12:04:39 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:23:49.784 AIO0 00:23:49.784 12:04:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:50.043 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:23:50.043 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:23:50.043 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:23:50.043 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:23:50.043 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:50.043 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:23:50.043 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:23:50.043 "name": "app_thread", 00:23:50.043 "id": 1, 00:23:50.043 "active_pollers": [], 00:23:50.043 "timed_pollers": [ 00:23:50.043 { 00:23:50.043 "name": "rpc_subsystem_poll_servers", 00:23:50.043 "id": 1, 00:23:50.043 "state": "waiting", 00:23:50.043 "run_count": 0, 00:23:50.043 "busy_count": 0, 
00:23:50.043 "period_ticks": 8400000 00:23:50.043 } 00:23:50.043 ], 00:23:50.043 "paused_pollers": [] 00:23:50.043 }' 00:23:50.043 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:23:50.043 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:23:50.043 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:23:50.301 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:23:50.301 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:23:50.301 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:23:50.301 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:23:50.301 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 753928 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 753928 ']' 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 753928 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 753928 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 
'killing process with pid 753928' 00:23:50.301 killing process with pid 753928 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 753928 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 753928 00:23:50.301 12:04:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:23:50.301 12:04:40 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:23:50.301 00:23:50.301 real 0m2.061s 00:23:50.301 user 0m1.179s 00:23:50.301 sys 0m0.483s 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:50.301 12:04:40 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:50.301 ************************************ 00:23:50.301 END TEST reap_unregistered_poller 00:23:50.301 ************************************ 00:23:50.559 12:04:40 -- common/autotest_common.sh@1142 -- # return 0 00:23:50.559 12:04:40 -- spdk/autotest.sh@198 -- # uname -s 00:23:50.559 12:04:40 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:23:50.559 12:04:40 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:23:50.559 12:04:40 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:23:50.559 12:04:40 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:23:50.559 12:04:40 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:23:50.559 12:04:40 -- spdk/autotest.sh@260 -- # timing_exit lib 00:23:50.559 12:04:40 -- common/autotest_common.sh@728 -- # xtrace_disable 00:23:50.559 12:04:40 -- common/autotest_common.sh@10 -- # set +x 00:23:50.559 12:04:40 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- 
spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:23:50.560 12:04:40 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:23:50.560 12:04:40 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:23:50.560 12:04:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:50.560 12:04:40 -- common/autotest_common.sh@10 -- # set +x 00:23:50.560 ************************************ 00:23:50.560 START TEST compress_compdev 00:23:50.560 ************************************ 00:23:50.560 12:04:40 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:23:50.560 * Looking for test storage... 
00:23:50.560 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:23:50.560 12:04:40 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:23:50.560 12:04:40 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:23:50.560 12:04:40 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:23:50.560 12:04:40 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:23:50.560 12:04:40 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:50.560 12:04:40 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:50.560 12:04:40 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:50.560 12:04:40 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:23:50.560 12:04:40 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:23:50.560 12:04:40 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:23:50.560 12:04:40 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:50.560 12:04:40 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:23:50.560 12:04:40 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:23:50.560 12:04:40 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:23:50.560 12:04:40 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:23:50.560 12:04:40 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=754290 00:23:50.560 12:04:40 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:50.560 12:04:40 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 754290 00:23:50.560 12:04:40 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 754290 ']' 00:23:50.560 12:04:40 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:23:50.560 12:04:40 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:50.560 12:04:40 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:50.560 12:04:40 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:50.560 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:50.560 12:04:40 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:50.560 12:04:40 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:23:50.819 [2024-07-12 12:04:40.820893] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:23:50.819 [2024-07-12 12:04:40.820935] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid754290 ] 00:23:50.819 [2024-07-12 12:04:40.886268] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:23:50.819 [2024-07-12 12:04:40.964516] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:23:50.819 [2024-07-12 12:04:40.964522] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:23:51.385 [2024-07-12 12:04:41.334691] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:23:51.385 12:04:41 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:51.385 12:04:41 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:23:51.385 12:04:41 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:23:51.385 12:04:41 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:51.385 12:04:41 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:23:54.665 [2024-07-12 12:04:44.603606] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2a40f40 PMD being used: compress_qat 00:23:54.665 12:04:44 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:23:54.665 12:04:44 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:23:54.665 12:04:44 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:54.665 12:04:44 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:23:54.665 12:04:44 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:54.665 12:04:44 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:54.665 12:04:44 
compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:54.665 12:04:44 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:23:54.923 [ 00:23:54.923 { 00:23:54.923 "name": "Nvme0n1", 00:23:54.923 "aliases": [ 00:23:54.923 "a1a5c171-cace-4ba3-8876-0b5cfdb7cec6" 00:23:54.923 ], 00:23:54.923 "product_name": "NVMe disk", 00:23:54.923 "block_size": 512, 00:23:54.923 "num_blocks": 1953525168, 00:23:54.923 "uuid": "a1a5c171-cace-4ba3-8876-0b5cfdb7cec6", 00:23:54.923 "assigned_rate_limits": { 00:23:54.923 "rw_ios_per_sec": 0, 00:23:54.923 "rw_mbytes_per_sec": 0, 00:23:54.923 "r_mbytes_per_sec": 0, 00:23:54.923 "w_mbytes_per_sec": 0 00:23:54.923 }, 00:23:54.923 "claimed": false, 00:23:54.923 "zoned": false, 00:23:54.923 "supported_io_types": { 00:23:54.923 "read": true, 00:23:54.923 "write": true, 00:23:54.923 "unmap": true, 00:23:54.923 "flush": true, 00:23:54.923 "reset": true, 00:23:54.923 "nvme_admin": true, 00:23:54.923 "nvme_io": true, 00:23:54.923 "nvme_io_md": false, 00:23:54.923 "write_zeroes": true, 00:23:54.923 "zcopy": false, 00:23:54.923 "get_zone_info": false, 00:23:54.923 "zone_management": false, 00:23:54.923 "zone_append": false, 00:23:54.923 "compare": false, 00:23:54.923 "compare_and_write": false, 00:23:54.923 "abort": true, 00:23:54.923 "seek_hole": false, 00:23:54.923 "seek_data": false, 00:23:54.923 "copy": false, 00:23:54.923 "nvme_iov_md": false 00:23:54.923 }, 00:23:54.923 "driver_specific": { 00:23:54.923 "nvme": [ 00:23:54.923 { 00:23:54.923 "pci_address": "0000:5e:00.0", 00:23:54.923 "trid": { 00:23:54.923 "trtype": "PCIe", 00:23:54.923 "traddr": "0000:5e:00.0" 00:23:54.923 }, 00:23:54.923 "ctrlr_data": { 00:23:54.923 "cntlid": 0, 00:23:54.923 "vendor_id": "0x8086", 00:23:54.923 "model_number": "INTEL SSDPE2KX010T8", 00:23:54.923 "serial_number": 
"BTLJ807001JM1P0FGN", 00:23:54.923 "firmware_revision": "VDV10170", 00:23:54.923 "oacs": { 00:23:54.923 "security": 1, 00:23:54.923 "format": 1, 00:23:54.923 "firmware": 1, 00:23:54.923 "ns_manage": 1 00:23:54.923 }, 00:23:54.923 "multi_ctrlr": false, 00:23:54.923 "ana_reporting": false 00:23:54.923 }, 00:23:54.923 "vs": { 00:23:54.923 "nvme_version": "1.2" 00:23:54.923 }, 00:23:54.923 "ns_data": { 00:23:54.923 "id": 1, 00:23:54.923 "can_share": false 00:23:54.923 }, 00:23:54.923 "security": { 00:23:54.923 "opal": true 00:23:54.923 } 00:23:54.923 } 00:23:54.923 ], 00:23:54.923 "mp_policy": "active_passive" 00:23:54.923 } 00:23:54.923 } 00:23:54.923 ] 00:23:54.923 12:04:44 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:23:54.923 12:04:44 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:23:54.923 [2024-07-12 12:04:45.151728] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2a41e70 PMD being used: compress_qat 00:23:55.857 9c68014f-bc05-41a5-ba10-8a68e85832b7 00:23:55.857 12:04:46 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:23:56.115 9433b4cb-fe3b-40f2-a254-27d1be4bb065 00:23:56.115 12:04:46 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:23:56.115 12:04:46 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:23:56.115 12:04:46 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:56.115 12:04:46 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:23:56.115 12:04:46 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:56.115 12:04:46 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:56.115 12:04:46 compress_compdev -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:56.115 12:04:46 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:23:56.373 [ 00:23:56.373 { 00:23:56.373 "name": "9433b4cb-fe3b-40f2-a254-27d1be4bb065", 00:23:56.373 "aliases": [ 00:23:56.373 "lvs0/lv0" 00:23:56.373 ], 00:23:56.373 "product_name": "Logical Volume", 00:23:56.373 "block_size": 512, 00:23:56.373 "num_blocks": 204800, 00:23:56.373 "uuid": "9433b4cb-fe3b-40f2-a254-27d1be4bb065", 00:23:56.373 "assigned_rate_limits": { 00:23:56.373 "rw_ios_per_sec": 0, 00:23:56.373 "rw_mbytes_per_sec": 0, 00:23:56.373 "r_mbytes_per_sec": 0, 00:23:56.373 "w_mbytes_per_sec": 0 00:23:56.373 }, 00:23:56.373 "claimed": false, 00:23:56.373 "zoned": false, 00:23:56.373 "supported_io_types": { 00:23:56.373 "read": true, 00:23:56.373 "write": true, 00:23:56.373 "unmap": true, 00:23:56.373 "flush": false, 00:23:56.373 "reset": true, 00:23:56.373 "nvme_admin": false, 00:23:56.373 "nvme_io": false, 00:23:56.373 "nvme_io_md": false, 00:23:56.373 "write_zeroes": true, 00:23:56.373 "zcopy": false, 00:23:56.373 "get_zone_info": false, 00:23:56.373 "zone_management": false, 00:23:56.373 "zone_append": false, 00:23:56.373 "compare": false, 00:23:56.373 "compare_and_write": false, 00:23:56.373 "abort": false, 00:23:56.373 "seek_hole": true, 00:23:56.373 "seek_data": true, 00:23:56.373 "copy": false, 00:23:56.373 "nvme_iov_md": false 00:23:56.373 }, 00:23:56.373 "driver_specific": { 00:23:56.373 "lvol": { 00:23:56.373 "lvol_store_uuid": "9c68014f-bc05-41a5-ba10-8a68e85832b7", 00:23:56.373 "base_bdev": "Nvme0n1", 00:23:56.373 "thin_provision": true, 00:23:56.373 "num_allocated_clusters": 0, 00:23:56.373 "snapshot": false, 00:23:56.373 "clone": false, 00:23:56.373 "esnap_clone": false 00:23:56.373 } 00:23:56.373 } 00:23:56.373 } 00:23:56.373 ] 00:23:56.373 12:04:46 compress_compdev -- 
common/autotest_common.sh@905 -- # return 0 00:23:56.373 12:04:46 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:23:56.373 12:04:46 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:23:56.631 [2024-07-12 12:04:46.660632] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:23:56.631 COMP_lvs0/lv0 00:23:56.631 12:04:46 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:23:56.631 12:04:46 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:23:56.631 12:04:46 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:56.631 12:04:46 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:23:56.631 12:04:46 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:56.631 12:04:46 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:56.631 12:04:46 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:56.631 12:04:46 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:23:56.889 [ 00:23:56.889 { 00:23:56.889 "name": "COMP_lvs0/lv0", 00:23:56.889 "aliases": [ 00:23:56.889 "01d349d2-52d2-52aa-ae78-82f9ef512596" 00:23:56.889 ], 00:23:56.889 "product_name": "compress", 00:23:56.889 "block_size": 512, 00:23:56.889 "num_blocks": 200704, 00:23:56.889 "uuid": "01d349d2-52d2-52aa-ae78-82f9ef512596", 00:23:56.889 "assigned_rate_limits": { 00:23:56.889 "rw_ios_per_sec": 0, 00:23:56.889 "rw_mbytes_per_sec": 0, 00:23:56.889 "r_mbytes_per_sec": 0, 00:23:56.889 "w_mbytes_per_sec": 0 00:23:56.889 }, 00:23:56.889 "claimed": false, 00:23:56.889 "zoned": false, 
00:23:56.889 "supported_io_types": { 00:23:56.889 "read": true, 00:23:56.889 "write": true, 00:23:56.889 "unmap": false, 00:23:56.889 "flush": false, 00:23:56.889 "reset": false, 00:23:56.889 "nvme_admin": false, 00:23:56.889 "nvme_io": false, 00:23:56.889 "nvme_io_md": false, 00:23:56.889 "write_zeroes": true, 00:23:56.889 "zcopy": false, 00:23:56.889 "get_zone_info": false, 00:23:56.889 "zone_management": false, 00:23:56.889 "zone_append": false, 00:23:56.889 "compare": false, 00:23:56.889 "compare_and_write": false, 00:23:56.889 "abort": false, 00:23:56.889 "seek_hole": false, 00:23:56.889 "seek_data": false, 00:23:56.889 "copy": false, 00:23:56.889 "nvme_iov_md": false 00:23:56.889 }, 00:23:56.889 "driver_specific": { 00:23:56.889 "compress": { 00:23:56.889 "name": "COMP_lvs0/lv0", 00:23:56.889 "base_bdev_name": "9433b4cb-fe3b-40f2-a254-27d1be4bb065" 00:23:56.889 } 00:23:56.889 } 00:23:56.889 } 00:23:56.889 ] 00:23:56.889 12:04:47 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:23:56.889 12:04:47 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:23:56.889 [2024-07-12 12:04:47.086390] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f6a3c1b15c0 PMD being used: compress_qat 00:23:56.889 [2024-07-12 12:04:47.087879] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2c33e00 PMD being used: compress_qat 00:23:56.890 Running I/O for 3 seconds... 
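A quick arithmetic cross-check of the bdev sizes reported above: the lvol was created with `bdev_lvol_create -t -l lvs0 lv0 100` and indeed reports 204800 blocks of 512 B (exactly 100 MiB), while the compress bdev layered on it exposes 200704 blocks (98 MiB). The 2 MiB difference is presumably reserved by the compress vbdev for its own bookkeeping; that interpretation is an assumption, the log itself only shows the two block counts:

```python
BLOCK = 512
lvol_blocks = 204800   # from `bdev_get_bdevs -b lvs0/lv0` in the log
comp_blocks = 200704   # from `bdev_get_bdevs -b COMP_lvs0/lv0` in the log

lvol_mib = lvol_blocks * BLOCK // 2**20
hidden_mib = (lvol_blocks - comp_blocks) * BLOCK // 2**20
print(lvol_mib, hidden_mib)  # 100 MiB backing lvol, 2 MiB not exposed by the compress bdev
```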
00:24:00.230 00:24:00.230 Latency(us) 00:24:00.230 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:00.230 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:00.230 Verification LBA range: start 0x0 length 0x3100 00:24:00.230 COMP_lvs0/lv0 : 3.01 4090.55 15.98 0.00 0.00 7782.28 128.73 13294.45 00:24:00.230 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:00.230 Verification LBA range: start 0x3100 length 0x3100 00:24:00.230 COMP_lvs0/lv0 : 3.01 4207.98 16.44 0.00 0.00 7571.79 120.93 12483.05 00:24:00.230 =================================================================================================================== 00:24:00.230 Total : 8298.54 32.42 0.00 0.00 7675.59 120.93 13294.45 00:24:00.230 0 00:24:00.230 12:04:50 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:24:00.230 12:04:50 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:00.230 12:04:50 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:00.488 12:04:50 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:00.488 12:04:50 compress_compdev -- compress/compress.sh@78 -- # killprocess 754290 00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 754290 ']' 00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 754290 00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 754290 00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 
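As a sanity check on the bdevperf table above (not part of the test run): the Total row can be reproduced from the two per-job rows. Summing IOPS gives the total, MiB/s for 4096-byte IOs is IOPS/256, and the total's average latency comes out as the IOPS-weighted mean of the per-job averages — the weighting is an inference about how bdevperf aggregates, though it agrees with the printed figures to within rounding:

```python
# Per-job rows from the bdevperf verify run (cores 0x2 and 0x4, 4 KiB IOs, qd 32).
jobs = [
    {"iops": 4090.55, "avg_us": 7782.28},
    {"iops": 4207.98, "avg_us": 7571.79},
]

total_iops = sum(j["iops"] for j in jobs)
# IOPS-weighted mean latency, assumed to be how the Total row's average is formed.
weighted_avg = sum(j["iops"] * j["avg_us"] for j in jobs) / total_iops
# 4096-byte IOs: bytes/s = IOPS * 4096, so MiB/s = IOPS / 256.
mib_s = total_iops / 256

print(round(total_iops, 2), round(weighted_avg, 2), round(mib_s, 2))
```

The results land on the logged Total row (8298.54 IOPS, 32.42 MiB/s, 7675.59 us average) to within rounding of the per-job figures.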
00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 754290' 00:24:00.488 killing process with pid 754290 00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@967 -- # kill 754290 00:24:00.488 Received shutdown signal, test time was about 3.000000 seconds 00:24:00.488 00:24:00.488 Latency(us) 00:24:00.488 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:00.488 =================================================================================================================== 00:24:00.488 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:00.488 12:04:50 compress_compdev -- common/autotest_common.sh@972 -- # wait 754290 00:24:01.859 12:04:51 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:24:01.859 12:04:51 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:24:01.859 12:04:51 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=756098 00:24:01.859 12:04:51 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:01.859 12:04:51 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:24:01.859 12:04:51 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 756098 00:24:01.859 12:04:51 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 756098 ']' 00:24:01.859 12:04:51 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:01.859 12:04:51 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:01.859 12:04:51 compress_compdev -- common/autotest_common.sh@836 -- # 
echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:01.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:01.859 12:04:51 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:01.860 12:04:51 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:01.860 [2024-07-12 12:04:52.023524] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:24:01.860 [2024-07-12 12:04:52.023568] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid756098 ] 00:24:01.860 [2024-07-12 12:04:52.088252] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:02.117 [2024-07-12 12:04:52.169099] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:02.117 [2024-07-12 12:04:52.169100] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:02.375 [2024-07-12 12:04:52.546743] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:24:02.632 12:04:52 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:02.632 12:04:52 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:24:02.632 12:04:52 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:24:02.632 12:04:52 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:02.632 12:04:52 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:05.910 [2024-07-12 12:04:55.819403] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22b3f40 PMD being used: compress_qat 00:24:05.910 12:04:55 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 
00:24:05.910 12:04:55 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:05.911 12:04:55 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:05.911 12:04:55 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:05.911 12:04:55 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:05.911 12:04:55 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:05.911 12:04:55 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:05.911 12:04:56 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:06.168 [ 00:24:06.168 { 00:24:06.168 "name": "Nvme0n1", 00:24:06.168 "aliases": [ 00:24:06.168 "1c36f38b-9c09-4bf2-897f-89c1ca3d2f32" 00:24:06.168 ], 00:24:06.168 "product_name": "NVMe disk", 00:24:06.168 "block_size": 512, 00:24:06.168 "num_blocks": 1953525168, 00:24:06.168 "uuid": "1c36f38b-9c09-4bf2-897f-89c1ca3d2f32", 00:24:06.168 "assigned_rate_limits": { 00:24:06.168 "rw_ios_per_sec": 0, 00:24:06.168 "rw_mbytes_per_sec": 0, 00:24:06.168 "r_mbytes_per_sec": 0, 00:24:06.168 "w_mbytes_per_sec": 0 00:24:06.168 }, 00:24:06.168 "claimed": false, 00:24:06.168 "zoned": false, 00:24:06.168 "supported_io_types": { 00:24:06.168 "read": true, 00:24:06.168 "write": true, 00:24:06.168 "unmap": true, 00:24:06.168 "flush": true, 00:24:06.168 "reset": true, 00:24:06.168 "nvme_admin": true, 00:24:06.168 "nvme_io": true, 00:24:06.168 "nvme_io_md": false, 00:24:06.168 "write_zeroes": true, 00:24:06.168 "zcopy": false, 00:24:06.168 "get_zone_info": false, 00:24:06.168 "zone_management": false, 00:24:06.168 "zone_append": false, 00:24:06.168 "compare": false, 00:24:06.168 "compare_and_write": false, 00:24:06.168 "abort": true, 00:24:06.168 "seek_hole": false, 00:24:06.168 "seek_data": 
false, 00:24:06.168 "copy": false, 00:24:06.168 "nvme_iov_md": false 00:24:06.168 }, 00:24:06.168 "driver_specific": { 00:24:06.168 "nvme": [ 00:24:06.168 { 00:24:06.168 "pci_address": "0000:5e:00.0", 00:24:06.168 "trid": { 00:24:06.168 "trtype": "PCIe", 00:24:06.168 "traddr": "0000:5e:00.0" 00:24:06.168 }, 00:24:06.168 "ctrlr_data": { 00:24:06.168 "cntlid": 0, 00:24:06.168 "vendor_id": "0x8086", 00:24:06.168 "model_number": "INTEL SSDPE2KX010T8", 00:24:06.168 "serial_number": "BTLJ807001JM1P0FGN", 00:24:06.168 "firmware_revision": "VDV10170", 00:24:06.168 "oacs": { 00:24:06.168 "security": 1, 00:24:06.168 "format": 1, 00:24:06.168 "firmware": 1, 00:24:06.168 "ns_manage": 1 00:24:06.168 }, 00:24:06.168 "multi_ctrlr": false, 00:24:06.168 "ana_reporting": false 00:24:06.168 }, 00:24:06.168 "vs": { 00:24:06.168 "nvme_version": "1.2" 00:24:06.168 }, 00:24:06.168 "ns_data": { 00:24:06.168 "id": 1, 00:24:06.168 "can_share": false 00:24:06.168 }, 00:24:06.168 "security": { 00:24:06.168 "opal": true 00:24:06.168 } 00:24:06.168 } 00:24:06.168 ], 00:24:06.168 "mp_policy": "active_passive" 00:24:06.168 } 00:24:06.168 } 00:24:06.168 ] 00:24:06.168 12:04:56 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:06.168 12:04:56 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:06.168 [2024-07-12 12:04:56.359421] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22b4e70 PMD being used: compress_qat 00:24:07.103 0cd4774c-d862-4e15-b5ad-98e2367fd3bd 00:24:07.103 12:04:57 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:07.362 6de22da0-1586-46b6-ad6b-ce329f52b540 00:24:07.362 12:04:57 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:07.362 12:04:57 compress_compdev -- common/autotest_common.sh@897 -- # 
local bdev_name=lvs0/lv0 00:24:07.362 12:04:57 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:07.362 12:04:57 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:07.362 12:04:57 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:07.362 12:04:57 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:07.362 12:04:57 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:07.362 12:04:57 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:07.620 [ 00:24:07.620 { 00:24:07.620 "name": "6de22da0-1586-46b6-ad6b-ce329f52b540", 00:24:07.620 "aliases": [ 00:24:07.620 "lvs0/lv0" 00:24:07.620 ], 00:24:07.620 "product_name": "Logical Volume", 00:24:07.620 "block_size": 512, 00:24:07.620 "num_blocks": 204800, 00:24:07.620 "uuid": "6de22da0-1586-46b6-ad6b-ce329f52b540", 00:24:07.620 "assigned_rate_limits": { 00:24:07.620 "rw_ios_per_sec": 0, 00:24:07.620 "rw_mbytes_per_sec": 0, 00:24:07.620 "r_mbytes_per_sec": 0, 00:24:07.620 "w_mbytes_per_sec": 0 00:24:07.620 }, 00:24:07.620 "claimed": false, 00:24:07.620 "zoned": false, 00:24:07.620 "supported_io_types": { 00:24:07.620 "read": true, 00:24:07.620 "write": true, 00:24:07.620 "unmap": true, 00:24:07.620 "flush": false, 00:24:07.620 "reset": true, 00:24:07.620 "nvme_admin": false, 00:24:07.620 "nvme_io": false, 00:24:07.620 "nvme_io_md": false, 00:24:07.620 "write_zeroes": true, 00:24:07.620 "zcopy": false, 00:24:07.620 "get_zone_info": false, 00:24:07.620 "zone_management": false, 00:24:07.620 "zone_append": false, 00:24:07.620 "compare": false, 00:24:07.620 "compare_and_write": false, 00:24:07.620 "abort": false, 00:24:07.620 "seek_hole": true, 00:24:07.620 "seek_data": true, 00:24:07.620 "copy": false, 00:24:07.620 "nvme_iov_md": false 
00:24:07.620 }, 00:24:07.620 "driver_specific": { 00:24:07.620 "lvol": { 00:24:07.620 "lvol_store_uuid": "0cd4774c-d862-4e15-b5ad-98e2367fd3bd", 00:24:07.620 "base_bdev": "Nvme0n1", 00:24:07.620 "thin_provision": true, 00:24:07.620 "num_allocated_clusters": 0, 00:24:07.620 "snapshot": false, 00:24:07.620 "clone": false, 00:24:07.620 "esnap_clone": false 00:24:07.620 } 00:24:07.620 } 00:24:07.620 } 00:24:07.620 ] 00:24:07.620 12:04:57 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:07.620 12:04:57 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:24:07.620 12:04:57 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:24:07.878 [2024-07-12 12:04:57.880985] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:07.878 COMP_lvs0/lv0 00:24:07.878 12:04:57 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:07.878 12:04:57 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:07.878 12:04:57 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:07.878 12:04:57 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:07.878 12:04:57 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:07.878 12:04:57 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:07.878 12:04:57 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:07.878 12:04:58 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:08.137 [ 00:24:08.137 { 00:24:08.137 "name": "COMP_lvs0/lv0", 00:24:08.137 "aliases": [ 00:24:08.137 
"f53690a0-c820-5653-9f1e-bc7e43076a49" 00:24:08.137 ], 00:24:08.137 "product_name": "compress", 00:24:08.137 "block_size": 512, 00:24:08.137 "num_blocks": 200704, 00:24:08.137 "uuid": "f53690a0-c820-5653-9f1e-bc7e43076a49", 00:24:08.137 "assigned_rate_limits": { 00:24:08.137 "rw_ios_per_sec": 0, 00:24:08.137 "rw_mbytes_per_sec": 0, 00:24:08.137 "r_mbytes_per_sec": 0, 00:24:08.137 "w_mbytes_per_sec": 0 00:24:08.137 }, 00:24:08.137 "claimed": false, 00:24:08.137 "zoned": false, 00:24:08.137 "supported_io_types": { 00:24:08.137 "read": true, 00:24:08.137 "write": true, 00:24:08.137 "unmap": false, 00:24:08.137 "flush": false, 00:24:08.137 "reset": false, 00:24:08.137 "nvme_admin": false, 00:24:08.137 "nvme_io": false, 00:24:08.137 "nvme_io_md": false, 00:24:08.137 "write_zeroes": true, 00:24:08.137 "zcopy": false, 00:24:08.137 "get_zone_info": false, 00:24:08.137 "zone_management": false, 00:24:08.137 "zone_append": false, 00:24:08.137 "compare": false, 00:24:08.137 "compare_and_write": false, 00:24:08.137 "abort": false, 00:24:08.137 "seek_hole": false, 00:24:08.137 "seek_data": false, 00:24:08.137 "copy": false, 00:24:08.137 "nvme_iov_md": false 00:24:08.137 }, 00:24:08.137 "driver_specific": { 00:24:08.137 "compress": { 00:24:08.137 "name": "COMP_lvs0/lv0", 00:24:08.137 "base_bdev_name": "6de22da0-1586-46b6-ad6b-ce329f52b540" 00:24:08.137 } 00:24:08.137 } 00:24:08.137 } 00:24:08.137 ] 00:24:08.137 12:04:58 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:08.137 12:04:58 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:08.137 [2024-07-12 12:04:58.302718] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1f341b15c0 PMD being used: compress_qat 00:24:08.137 [2024-07-12 12:04:58.304135] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24a6f40 PMD being used: compress_qat 00:24:08.137 Running I/O for 3 seconds... 
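The create_vols/destroy_vols traces above reduce to a short RPC sequence: an lvstore on the NVMe bdev, a thin-provisioned lvol, then a compress bdev layered on top, torn down in reverse order. A minimal dry-run sketch of that sequence — the `RPC` path and the choice to only print the commands are assumptions for illustration; the subcommands and flags are the ones actually traced in this log:

```shell
#!/usr/bin/env bash
# Dry-run sketch of the create_vols/destroy_vols sequence traced above.
# RPC points at SPDK's rpc.py; the path here is an assumption, and the
# commands are printed rather than executed against a live target.
RPC=./scripts/rpc.py
LB_SIZE=512   # logical block size this run_bdevperf invocation uses

rpc() { echo "$RPC $*"; }   # swap echo out to actually issue the RPCs

# create_vols: lvstore, thin lvol, then the compress bdev (compress.sh@34-@44)
rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
rpc bdev_lvol_create -t -l lvs0 lv0 100
rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l "$LB_SIZE"

# destroy_vols: tear down in reverse order (compress.sh@29-@30)
rpc bdev_compress_delete COMP_lvs0/lv0
rpc bdev_lvol_delete_lvstore -l lvs0
```

The printed lines mirror the xtrace records in this log; pointing `RPC` at a live SPDK application and dropping the `echo` would issue the same sequence.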
00:24:11.414
00:24:11.414 Latency(us)
00:24:11.414 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:24:11.414 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096)
00:24:11.414 Verification LBA range: start 0x0 length 0x3100
00:24:11.414 COMP_lvs0/lv0 : 3.01 4049.85 15.82 0.00 0.00 7866.18 126.78 13481.69
00:24:11.414 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096)
00:24:11.414 Verification LBA range: start 0x3100 length 0x3100
00:24:11.414 COMP_lvs0/lv0 : 3.01 4170.71 16.29 0.00 0.00 7638.78 119.47 13856.18
00:24:11.414 ===================================================================================================================
00:24:11.414 Total : 8220.55 32.11 0.00 0.00 7750.80 119.47 13856.18
00:24:11.414 0
00:24:11.414 12:05:01 compress_compdev -- compress/compress.sh@76 -- # destroy_vols
00:24:11.414 12:05:01 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
00:24:11.414 12:05:01 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
00:24:11.672 12:05:01 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT
00:24:11.672 12:05:01 compress_compdev -- compress/compress.sh@78 -- # killprocess 756098
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 756098 ']'
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 756098
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@953 -- # uname
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 756098
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1
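The killprocess trace here (autotest_common.sh@948 onward) follows a simple pattern: guard against an empty pid, probe the process with `kill -0`, read its command name, refuse to kill `sudo`, then kill and reap it. A simplified runnable reconstruction of that flow, demonstrated on a throwaway `sleep` rather than on bdevperf — this is a sketch of the pattern, not SPDK's actual autotest_common.sh:

```shell
#!/usr/bin/env bash
# Simplified re-creation of the killprocess() flow traced in this log:
# validate the pid, confirm the process is alive, check its command name
# (the real helper refuses to kill "sudo"), then kill and reap it.
killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1                  # the '[' -z ... ']' guard
    kill -0 "$pid" 2>/dev/null || return 1     # process still alive?
    local process_name
    process_name=$(ps -p "$pid" -o comm=)      # the log uses ps --no-headers -o comm=
    [ "$process_name" != "sudo" ] || return 1  # never kill the sudo wrapper
    echo "killing process with pid $pid ($process_name)"
    kill "$pid"
    wait "$pid" 2>/dev/null
    return 0
}

sleep 30 &
target=$!
killprocess "$target"
kill -0 "$target" 2>/dev/null || echo "pid $target is gone"
```

In the log the name check resolves to `reactor_1` (the SPDK reactor thread), so the kill proceeds and `wait` collects the exit status, exactly the @967/@972 records that follow.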
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 756098' killing process with pid 756098
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@967 -- # kill 756098
00:24:11.672 Received shutdown signal, test time was about 3.000000 seconds
00:24:11.672
00:24:11.672 Latency(us)
00:24:11.672 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:24:11.672 ===================================================================================================================
00:24:11.672 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:24:11.672 12:05:01 compress_compdev -- common/autotest_common.sh@972 -- # wait 756098
00:24:13.045 12:05:03 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096
00:24:13.045 12:05:03 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]]
00:24:13.045 12:05:03 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=757937
00:24:13.045 12:05:03 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT
00:24:13.045 12:05:03 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json
00:24:13.045 12:05:03 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 757937
00:24:13.045 12:05:03 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 757937 ']'
00:24:13.045 12:05:03 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:24:13.045 12:05:03 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100
00:24:13.045 12:05:03 compress_compdev -- common/autotest_common.sh@836 -- #
echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:13.045 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:13.045 12:05:03 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:13.045 12:05:03 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:13.045 [2024-07-12 12:05:03.253348] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:24:13.045 [2024-07-12 12:05:03.253389] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid757937 ] 00:24:13.302 [2024-07-12 12:05:03.317031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:13.302 [2024-07-12 12:05:03.395684] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:13.302 [2024-07-12 12:05:03.395685] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:13.560 [2024-07-12 12:05:03.770044] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:24:13.818 12:05:04 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:13.818 12:05:04 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:24:13.818 12:05:04 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:24:13.818 12:05:04 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:13.818 12:05:04 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:17.099 [2024-07-12 12:05:07.051531] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26c2f40 PMD being used: compress_qat 00:24:17.099 12:05:07 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 
00:24:17.099 12:05:07 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:17.099 12:05:07 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:17.099 12:05:07 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:17.099 12:05:07 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:17.099 12:05:07 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:17.099 12:05:07 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:17.099 12:05:07 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:17.357 [ 00:24:17.357 { 00:24:17.357 "name": "Nvme0n1", 00:24:17.357 "aliases": [ 00:24:17.357 "50809ebc-a32b-44a9-850f-77f43d8096c0" 00:24:17.357 ], 00:24:17.357 "product_name": "NVMe disk", 00:24:17.357 "block_size": 512, 00:24:17.357 "num_blocks": 1953525168, 00:24:17.357 "uuid": "50809ebc-a32b-44a9-850f-77f43d8096c0", 00:24:17.357 "assigned_rate_limits": { 00:24:17.357 "rw_ios_per_sec": 0, 00:24:17.357 "rw_mbytes_per_sec": 0, 00:24:17.357 "r_mbytes_per_sec": 0, 00:24:17.357 "w_mbytes_per_sec": 0 00:24:17.357 }, 00:24:17.357 "claimed": false, 00:24:17.357 "zoned": false, 00:24:17.357 "supported_io_types": { 00:24:17.357 "read": true, 00:24:17.357 "write": true, 00:24:17.357 "unmap": true, 00:24:17.357 "flush": true, 00:24:17.357 "reset": true, 00:24:17.357 "nvme_admin": true, 00:24:17.357 "nvme_io": true, 00:24:17.357 "nvme_io_md": false, 00:24:17.357 "write_zeroes": true, 00:24:17.357 "zcopy": false, 00:24:17.357 "get_zone_info": false, 00:24:17.357 "zone_management": false, 00:24:17.357 "zone_append": false, 00:24:17.357 "compare": false, 00:24:17.357 "compare_and_write": false, 00:24:17.357 "abort": true, 00:24:17.357 "seek_hole": false, 00:24:17.357 "seek_data": 
false, 00:24:17.357 "copy": false, 00:24:17.357 "nvme_iov_md": false 00:24:17.357 }, 00:24:17.357 "driver_specific": { 00:24:17.357 "nvme": [ 00:24:17.357 { 00:24:17.357 "pci_address": "0000:5e:00.0", 00:24:17.357 "trid": { 00:24:17.357 "trtype": "PCIe", 00:24:17.357 "traddr": "0000:5e:00.0" 00:24:17.357 }, 00:24:17.357 "ctrlr_data": { 00:24:17.357 "cntlid": 0, 00:24:17.357 "vendor_id": "0x8086", 00:24:17.357 "model_number": "INTEL SSDPE2KX010T8", 00:24:17.357 "serial_number": "BTLJ807001JM1P0FGN", 00:24:17.357 "firmware_revision": "VDV10170", 00:24:17.357 "oacs": { 00:24:17.357 "security": 1, 00:24:17.357 "format": 1, 00:24:17.357 "firmware": 1, 00:24:17.357 "ns_manage": 1 00:24:17.357 }, 00:24:17.357 "multi_ctrlr": false, 00:24:17.357 "ana_reporting": false 00:24:17.357 }, 00:24:17.357 "vs": { 00:24:17.357 "nvme_version": "1.2" 00:24:17.357 }, 00:24:17.357 "ns_data": { 00:24:17.357 "id": 1, 00:24:17.357 "can_share": false 00:24:17.357 }, 00:24:17.357 "security": { 00:24:17.357 "opal": true 00:24:17.357 } 00:24:17.357 } 00:24:17.357 ], 00:24:17.357 "mp_policy": "active_passive" 00:24:17.357 } 00:24:17.357 } 00:24:17.357 ] 00:24:17.357 12:05:07 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:17.357 12:05:07 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:17.357 [2024-07-12 12:05:07.539357] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26c3e70 PMD being used: compress_qat 00:24:18.290 8a780628-c9f6-412c-b22f-7d6a2a853650 00:24:18.290 12:05:08 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:18.548 5a9a2e92-e533-4099-9bf0-82306cc29e67 00:24:18.548 12:05:08 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:18.548 12:05:08 compress_compdev -- common/autotest_common.sh@897 -- # 
local bdev_name=lvs0/lv0 00:24:18.548 12:05:08 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:18.548 12:05:08 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:18.548 12:05:08 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:18.548 12:05:08 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:18.548 12:05:08 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:18.548 12:05:08 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:18.805 [ 00:24:18.805 { 00:24:18.805 "name": "5a9a2e92-e533-4099-9bf0-82306cc29e67", 00:24:18.805 "aliases": [ 00:24:18.805 "lvs0/lv0" 00:24:18.805 ], 00:24:18.805 "product_name": "Logical Volume", 00:24:18.805 "block_size": 512, 00:24:18.805 "num_blocks": 204800, 00:24:18.805 "uuid": "5a9a2e92-e533-4099-9bf0-82306cc29e67", 00:24:18.805 "assigned_rate_limits": { 00:24:18.805 "rw_ios_per_sec": 0, 00:24:18.805 "rw_mbytes_per_sec": 0, 00:24:18.805 "r_mbytes_per_sec": 0, 00:24:18.805 "w_mbytes_per_sec": 0 00:24:18.805 }, 00:24:18.805 "claimed": false, 00:24:18.805 "zoned": false, 00:24:18.805 "supported_io_types": { 00:24:18.805 "read": true, 00:24:18.805 "write": true, 00:24:18.805 "unmap": true, 00:24:18.805 "flush": false, 00:24:18.806 "reset": true, 00:24:18.806 "nvme_admin": false, 00:24:18.806 "nvme_io": false, 00:24:18.806 "nvme_io_md": false, 00:24:18.806 "write_zeroes": true, 00:24:18.806 "zcopy": false, 00:24:18.806 "get_zone_info": false, 00:24:18.806 "zone_management": false, 00:24:18.806 "zone_append": false, 00:24:18.806 "compare": false, 00:24:18.806 "compare_and_write": false, 00:24:18.806 "abort": false, 00:24:18.806 "seek_hole": true, 00:24:18.806 "seek_data": true, 00:24:18.806 "copy": false, 00:24:18.806 "nvme_iov_md": false 
00:24:18.806 }, 00:24:18.806 "driver_specific": { 00:24:18.806 "lvol": { 00:24:18.806 "lvol_store_uuid": "8a780628-c9f6-412c-b22f-7d6a2a853650", 00:24:18.806 "base_bdev": "Nvme0n1", 00:24:18.806 "thin_provision": true, 00:24:18.806 "num_allocated_clusters": 0, 00:24:18.806 "snapshot": false, 00:24:18.806 "clone": false, 00:24:18.806 "esnap_clone": false 00:24:18.806 } 00:24:18.806 } 00:24:18.806 } 00:24:18.806 ] 00:24:18.806 12:05:08 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:18.806 12:05:08 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:24:18.806 12:05:08 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:24:19.063 [2024-07-12 12:05:09.092057] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:19.063 COMP_lvs0/lv0 00:24:19.063 12:05:09 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:19.063 12:05:09 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:19.063 12:05:09 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:19.063 12:05:09 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:19.063 12:05:09 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:19.063 12:05:09 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:19.063 12:05:09 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:19.064 12:05:09 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:19.321 [ 00:24:19.321 { 00:24:19.321 "name": "COMP_lvs0/lv0", 00:24:19.321 "aliases": [ 00:24:19.321 
"86ea17d8-d0d7-550c-baa4-1086e4a55a7f" 00:24:19.321 ], 00:24:19.321 "product_name": "compress", 00:24:19.321 "block_size": 4096, 00:24:19.321 "num_blocks": 25088, 00:24:19.321 "uuid": "86ea17d8-d0d7-550c-baa4-1086e4a55a7f", 00:24:19.321 "assigned_rate_limits": { 00:24:19.321 "rw_ios_per_sec": 0, 00:24:19.321 "rw_mbytes_per_sec": 0, 00:24:19.321 "r_mbytes_per_sec": 0, 00:24:19.321 "w_mbytes_per_sec": 0 00:24:19.321 }, 00:24:19.321 "claimed": false, 00:24:19.321 "zoned": false, 00:24:19.321 "supported_io_types": { 00:24:19.321 "read": true, 00:24:19.321 "write": true, 00:24:19.321 "unmap": false, 00:24:19.321 "flush": false, 00:24:19.321 "reset": false, 00:24:19.321 "nvme_admin": false, 00:24:19.321 "nvme_io": false, 00:24:19.321 "nvme_io_md": false, 00:24:19.321 "write_zeroes": true, 00:24:19.321 "zcopy": false, 00:24:19.321 "get_zone_info": false, 00:24:19.321 "zone_management": false, 00:24:19.321 "zone_append": false, 00:24:19.321 "compare": false, 00:24:19.321 "compare_and_write": false, 00:24:19.321 "abort": false, 00:24:19.321 "seek_hole": false, 00:24:19.321 "seek_data": false, 00:24:19.321 "copy": false, 00:24:19.321 "nvme_iov_md": false 00:24:19.321 }, 00:24:19.321 "driver_specific": { 00:24:19.321 "compress": { 00:24:19.321 "name": "COMP_lvs0/lv0", 00:24:19.321 "base_bdev_name": "5a9a2e92-e533-4099-9bf0-82306cc29e67" 00:24:19.321 } 00:24:19.321 } 00:24:19.321 } 00:24:19.321 ] 00:24:19.321 12:05:09 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:19.321 12:05:09 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:19.321 [2024-07-12 12:05:09.509810] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f4ab41b15c0 PMD being used: compress_qat 00:24:19.321 [2024-07-12 12:05:09.511340] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26b7440 PMD being used: compress_qat 00:24:19.321 Running I/O for 3 seconds... 
00:24:22.599
00:24:22.599 Latency(us)
00:24:22.599 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:24:22.599 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096)
00:24:22.599 Verification LBA range: start 0x0 length 0x3100
00:24:22.599 COMP_lvs0/lv0 : 3.01 4014.88 15.68 0.00 0.00 7929.76 173.59 13044.78
00:24:22.599 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096)
00:24:22.599 Verification LBA range: start 0x3100 length 0x3100
00:24:22.599 COMP_lvs0/lv0 : 3.01 4085.57 15.96 0.00 0.00 7794.22 164.82 13606.52
00:24:22.599 ===================================================================================================================
00:24:22.599 Total : 8100.45 31.64 0.00 0.00 7861.40 164.82 13606.52
00:24:22.599 0
00:24:22.599 12:05:12 compress_compdev -- compress/compress.sh@76 -- # destroy_vols
00:24:22.599 12:05:12 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
00:24:22.599 12:05:12 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
00:24:22.856 12:05:12 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT
00:24:22.856 12:05:12 compress_compdev -- compress/compress.sh@78 -- # killprocess 757937
00:24:22.856 12:05:12 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 757937 ']'
00:24:22.856 12:05:12 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 757937
00:24:22.857 12:05:12 compress_compdev -- common/autotest_common.sh@953 -- # uname
00:24:22.857 12:05:12 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:24:22.857 12:05:12 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 757937
00:24:22.857 12:05:12 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1
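The Total row in the table above is a straightforward aggregation of the two per-core jobs: IOPS and MiB/s are summed, the average latency is an IOPS-weighted mean, and min/max span both jobs. A quick awk check using the figures from this run — the aggregation rule is inferred from the numbers, not taken from bdevperf's source:

```shell
#!/usr/bin/env bash
# Recompute the "Total" row above from the two per-core job rows.
# IOPS and MiB/s sum; average latency is weighted by IOPS; min/max span jobs.
total=$(awk 'BEGIN {
    # job rows copied from the log: iops, MiB/s, avg us, min us, max us
    i1 = 4014.88; m1 = 15.68; a1 = 7929.76; lo1 = 173.59; hi1 = 13044.78
    i2 = 4085.57; m2 = 15.96; a2 = 7794.22; lo2 = 164.82; hi2 = 13606.52
    iops = i1 + i2
    avg  = (i1 * a1 + i2 * a2) / iops
    printf "Total : %.2f %.2f avg %.2f min %.2f max %.2f", \
        iops, m1 + m2, avg, (lo1 < lo2 ? lo1 : lo2), (hi1 > hi2 ? hi1 : hi2)
}')
echo "$total"
```

Under that inferred rule the recomputed values (8100.45 IOPS, 31.64 MiB/s, 7861.40 us average, 164.82/13606.52 min/max) agree with the logged Total row.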
00:24:22.857 12:05:12 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:24:22.857 12:05:12 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 757937' killing process with pid 757937
00:24:22.857 12:05:12 compress_compdev -- common/autotest_common.sh@967 -- # kill 757937
00:24:22.857 Received shutdown signal, test time was about 3.000000 seconds
00:24:22.857
00:24:22.857 Latency(us)
00:24:22.857 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:24:22.857 ===================================================================================================================
00:24:22.857 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:24:22.857 12:05:12 compress_compdev -- common/autotest_common.sh@972 -- # wait 757937
00:24:24.227 12:05:14 compress_compdev -- compress/compress.sh@89 -- # run_bdevio
00:24:24.227 12:05:14 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]]
00:24:24.227 12:05:14 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=759775
00:24:24.227 12:05:14 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT
00:24:24.227 12:05:14 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w
00:24:24.227 12:05:14 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 759775
00:24:24.227 12:05:14 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 759775 ']'
00:24:24.227 12:05:14 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:24:24.227 12:05:14 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100
00:24:24.227 12:05:14 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX
domain socket /var/tmp/spdk.sock...' 00:24:24.227 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:24.227 12:05:14 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:24.227 12:05:14 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:24.227 [2024-07-12 12:05:14.456984] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:24:24.227 [2024-07-12 12:05:14.457027] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid759775 ] 00:24:24.485 [2024-07-12 12:05:14.520623] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:24.485 [2024-07-12 12:05:14.601738] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:24.485 [2024-07-12 12:05:14.601834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:24.485 [2024-07-12 12:05:14.601834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:24.743 [2024-07-12 12:05:14.988069] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:24:25.310 12:05:15 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:25.310 12:05:15 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:24:25.310 12:05:15 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:24:25.310 12:05:15 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:25.310 12:05:15 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:28.589 [2024-07-12 12:05:18.267857] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fd8ac0 PMD being used: compress_qat 00:24:28.589 12:05:18 compress_compdev -- 
compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:28.589 12:05:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:28.589 12:05:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:28.589 12:05:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:28.589 12:05:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:28.589 12:05:18 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:28.589 12:05:18 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:28.589 12:05:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:28.589 [ 00:24:28.589 { 00:24:28.589 "name": "Nvme0n1", 00:24:28.589 "aliases": [ 00:24:28.589 "f5482633-673d-4722-b8c6-bd6c43d69a6e" 00:24:28.589 ], 00:24:28.589 "product_name": "NVMe disk", 00:24:28.589 "block_size": 512, 00:24:28.589 "num_blocks": 1953525168, 00:24:28.589 "uuid": "f5482633-673d-4722-b8c6-bd6c43d69a6e", 00:24:28.589 "assigned_rate_limits": { 00:24:28.589 "rw_ios_per_sec": 0, 00:24:28.589 "rw_mbytes_per_sec": 0, 00:24:28.589 "r_mbytes_per_sec": 0, 00:24:28.589 "w_mbytes_per_sec": 0 00:24:28.589 }, 00:24:28.589 "claimed": false, 00:24:28.589 "zoned": false, 00:24:28.589 "supported_io_types": { 00:24:28.589 "read": true, 00:24:28.589 "write": true, 00:24:28.589 "unmap": true, 00:24:28.589 "flush": true, 00:24:28.589 "reset": true, 00:24:28.589 "nvme_admin": true, 00:24:28.589 "nvme_io": true, 00:24:28.589 "nvme_io_md": false, 00:24:28.589 "write_zeroes": true, 00:24:28.589 "zcopy": false, 00:24:28.589 "get_zone_info": false, 00:24:28.589 "zone_management": false, 00:24:28.589 "zone_append": false, 00:24:28.589 "compare": false, 00:24:28.589 "compare_and_write": false, 00:24:28.589 "abort": true, 00:24:28.589 
"seek_hole": false, 00:24:28.589 "seek_data": false, 00:24:28.589 "copy": false, 00:24:28.589 "nvme_iov_md": false 00:24:28.589 }, 00:24:28.589 "driver_specific": { 00:24:28.589 "nvme": [ 00:24:28.589 { 00:24:28.589 "pci_address": "0000:5e:00.0", 00:24:28.589 "trid": { 00:24:28.589 "trtype": "PCIe", 00:24:28.589 "traddr": "0000:5e:00.0" 00:24:28.589 }, 00:24:28.589 "ctrlr_data": { 00:24:28.589 "cntlid": 0, 00:24:28.589 "vendor_id": "0x8086", 00:24:28.589 "model_number": "INTEL SSDPE2KX010T8", 00:24:28.589 "serial_number": "BTLJ807001JM1P0FGN", 00:24:28.589 "firmware_revision": "VDV10170", 00:24:28.589 "oacs": { 00:24:28.589 "security": 1, 00:24:28.589 "format": 1, 00:24:28.589 "firmware": 1, 00:24:28.589 "ns_manage": 1 00:24:28.589 }, 00:24:28.589 "multi_ctrlr": false, 00:24:28.589 "ana_reporting": false 00:24:28.589 }, 00:24:28.589 "vs": { 00:24:28.589 "nvme_version": "1.2" 00:24:28.589 }, 00:24:28.589 "ns_data": { 00:24:28.589 "id": 1, 00:24:28.589 "can_share": false 00:24:28.589 }, 00:24:28.589 "security": { 00:24:28.589 "opal": true 00:24:28.589 } 00:24:28.589 } 00:24:28.589 ], 00:24:28.589 "mp_policy": "active_passive" 00:24:28.589 } 00:24:28.589 } 00:24:28.589 ] 00:24:28.589 12:05:18 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:28.589 12:05:18 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:28.589 [2024-07-12 12:05:18.800196] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e3dd60 PMD being used: compress_qat 00:24:29.582 ef81ba55-6538-4608-bb22-e8003ccdcfbf 00:24:29.582 12:05:19 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:29.582 6b86a227-bccd-497f-bf31-6a4a2be273a8 00:24:29.840 12:05:19 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:29.840 12:05:19 
compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:29.840 12:05:19 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:29.840 12:05:19 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:29.840 12:05:19 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:29.840 12:05:19 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:29.840 12:05:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:29.840 12:05:20 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:30.097 [ 00:24:30.097 { 00:24:30.097 "name": "6b86a227-bccd-497f-bf31-6a4a2be273a8", 00:24:30.097 "aliases": [ 00:24:30.097 "lvs0/lv0" 00:24:30.097 ], 00:24:30.097 "product_name": "Logical Volume", 00:24:30.097 "block_size": 512, 00:24:30.097 "num_blocks": 204800, 00:24:30.097 "uuid": "6b86a227-bccd-497f-bf31-6a4a2be273a8", 00:24:30.097 "assigned_rate_limits": { 00:24:30.097 "rw_ios_per_sec": 0, 00:24:30.097 "rw_mbytes_per_sec": 0, 00:24:30.097 "r_mbytes_per_sec": 0, 00:24:30.097 "w_mbytes_per_sec": 0 00:24:30.097 }, 00:24:30.097 "claimed": false, 00:24:30.097 "zoned": false, 00:24:30.097 "supported_io_types": { 00:24:30.097 "read": true, 00:24:30.097 "write": true, 00:24:30.097 "unmap": true, 00:24:30.098 "flush": false, 00:24:30.098 "reset": true, 00:24:30.098 "nvme_admin": false, 00:24:30.098 "nvme_io": false, 00:24:30.098 "nvme_io_md": false, 00:24:30.098 "write_zeroes": true, 00:24:30.098 "zcopy": false, 00:24:30.098 "get_zone_info": false, 00:24:30.098 "zone_management": false, 00:24:30.098 "zone_append": false, 00:24:30.098 "compare": false, 00:24:30.098 "compare_and_write": false, 00:24:30.098 "abort": false, 00:24:30.098 "seek_hole": true, 00:24:30.098 "seek_data": true, 
00:24:30.098 "copy": false, 00:24:30.098 "nvme_iov_md": false 00:24:30.098 }, 00:24:30.098 "driver_specific": { 00:24:30.098 "lvol": { 00:24:30.098 "lvol_store_uuid": "ef81ba55-6538-4608-bb22-e8003ccdcfbf", 00:24:30.098 "base_bdev": "Nvme0n1", 00:24:30.098 "thin_provision": true, 00:24:30.098 "num_allocated_clusters": 0, 00:24:30.098 "snapshot": false, 00:24:30.098 "clone": false, 00:24:30.098 "esnap_clone": false 00:24:30.098 } 00:24:30.098 } 00:24:30.098 } 00:24:30.098 ] 00:24:30.098 12:05:20 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:30.098 12:05:20 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:30.098 12:05:20 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:30.355 [2024-07-12 12:05:20.359151] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:30.355 COMP_lvs0/lv0 00:24:30.355 12:05:20 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:30.355 12:05:20 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:30.355 12:05:20 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:30.355 12:05:20 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:24:30.355 12:05:20 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:30.355 12:05:20 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:30.355 12:05:20 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:30.355 12:05:20 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:30.613 [ 00:24:30.613 { 00:24:30.613 "name": "COMP_lvs0/lv0", 
00:24:30.613 "aliases": [ 00:24:30.613 "ba6b4d42-3009-5060-88b1-69d41ecf1b8b" 00:24:30.613 ], 00:24:30.613 "product_name": "compress", 00:24:30.613 "block_size": 512, 00:24:30.613 "num_blocks": 200704, 00:24:30.613 "uuid": "ba6b4d42-3009-5060-88b1-69d41ecf1b8b", 00:24:30.613 "assigned_rate_limits": { 00:24:30.613 "rw_ios_per_sec": 0, 00:24:30.613 "rw_mbytes_per_sec": 0, 00:24:30.613 "r_mbytes_per_sec": 0, 00:24:30.613 "w_mbytes_per_sec": 0 00:24:30.613 }, 00:24:30.613 "claimed": false, 00:24:30.613 "zoned": false, 00:24:30.613 "supported_io_types": { 00:24:30.613 "read": true, 00:24:30.613 "write": true, 00:24:30.613 "unmap": false, 00:24:30.613 "flush": false, 00:24:30.613 "reset": false, 00:24:30.613 "nvme_admin": false, 00:24:30.613 "nvme_io": false, 00:24:30.613 "nvme_io_md": false, 00:24:30.613 "write_zeroes": true, 00:24:30.613 "zcopy": false, 00:24:30.613 "get_zone_info": false, 00:24:30.613 "zone_management": false, 00:24:30.613 "zone_append": false, 00:24:30.613 "compare": false, 00:24:30.613 "compare_and_write": false, 00:24:30.613 "abort": false, 00:24:30.613 "seek_hole": false, 00:24:30.613 "seek_data": false, 00:24:30.613 "copy": false, 00:24:30.613 "nvme_iov_md": false 00:24:30.613 }, 00:24:30.613 "driver_specific": { 00:24:30.613 "compress": { 00:24:30.613 "name": "COMP_lvs0/lv0", 00:24:30.613 "base_bdev_name": "6b86a227-bccd-497f-bf31-6a4a2be273a8" 00:24:30.613 } 00:24:30.613 } 00:24:30.613 } 00:24:30.613 ] 00:24:30.613 12:05:20 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:24:30.613 12:05:20 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:24:30.613 [2024-07-12 12:05:20.820035] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f0c641b1350 PMD being used: compress_qat 00:24:30.613 I/O targets: 00:24:30.613 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:24:30.613 00:24:30.613 00:24:30.613 CUnit - A unit testing framework 
for C - Version 2.1-3 00:24:30.613 http://cunit.sourceforge.net/ 00:24:30.613 00:24:30.613 00:24:30.613 Suite: bdevio tests on: COMP_lvs0/lv0 00:24:30.613 Test: blockdev write read block ...passed 00:24:30.613 Test: blockdev write zeroes read block ...passed 00:24:30.613 Test: blockdev write zeroes read no split ...passed 00:24:30.613 Test: blockdev write zeroes read split ...passed 00:24:30.871 Test: blockdev write zeroes read split partial ...passed 00:24:30.871 Test: blockdev reset ...[2024-07-12 12:05:20.873591] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:24:30.871 passed 00:24:30.871 Test: blockdev write read 8 blocks ...passed 00:24:30.871 Test: blockdev write read size > 128k ...passed 00:24:30.871 Test: blockdev write read invalid size ...passed 00:24:30.871 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:30.871 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:24:30.871 Test: blockdev write read max offset ...passed 00:24:30.871 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:30.871 Test: blockdev writev readv 8 blocks ...passed 00:24:30.871 Test: blockdev writev readv 30 x 1block ...passed 00:24:30.871 Test: blockdev writev readv block ...passed 00:24:30.871 Test: blockdev writev readv size > 128k ...passed 00:24:30.871 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:30.871 Test: blockdev comparev and writev ...passed 00:24:30.871 Test: blockdev nvme passthru rw ...passed 00:24:30.871 Test: blockdev nvme passthru vendor specific ...passed 00:24:30.871 Test: blockdev nvme admin passthru ...passed 00:24:30.871 Test: blockdev copy ...passed 00:24:30.871 00:24:30.871 Run Summary: Type Total Ran Passed Failed Inactive 00:24:30.871 suites 1 1 n/a 0 0 00:24:30.871 tests 23 23 23 0 0 00:24:30.871 asserts 130 130 130 0 n/a 00:24:30.871 00:24:30.871 Elapsed time = 0.157 seconds 00:24:30.871 0 00:24:30.871 12:05:20 
compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:24:30.871 12:05:20 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:30.871 12:05:21 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:31.129 12:05:21 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:24:31.129 12:05:21 compress_compdev -- compress/compress.sh@62 -- # killprocess 759775 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 759775 ']' 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 759775 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 759775 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 759775' 00:24:31.129 killing process with pid 759775 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@967 -- # kill 759775 00:24:31.129 12:05:21 compress_compdev -- common/autotest_common.sh@972 -- # wait 759775 00:24:33.027 12:05:22 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:24:33.027 12:05:22 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:24:33.027 00:24:33.027 real 0m42.122s 00:24:33.027 user 1m34.587s 00:24:33.027 sys 0m3.506s 00:24:33.027 12:05:22 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:33.027 12:05:22 
compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:33.027 ************************************ 00:24:33.027 END TEST compress_compdev 00:24:33.027 ************************************ 00:24:33.027 12:05:22 -- common/autotest_common.sh@1142 -- # return 0 00:24:33.027 12:05:22 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:24:33.027 12:05:22 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:24:33.027 12:05:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:33.027 12:05:22 -- common/autotest_common.sh@10 -- # set +x 00:24:33.027 ************************************ 00:24:33.027 START TEST compress_isal 00:24:33.027 ************************************ 00:24:33.027 12:05:22 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:24:33.027 * Looking for test storage... 00:24:33.027 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:24:33.027 12:05:22 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:24:33.027 12:05:22 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:24:33.027 12:05:22 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:24:33.027 12:05:22 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:24:33.027 12:05:22 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:24:33.027 12:05:22 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:24:33.027 12:05:22 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:24:33.027 12:05:22 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:24:33.027 12:05:22 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:24:33.027 12:05:22 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:24:33.027 12:05:22 compress_isal 
-- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:33.028 12:05:22 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:33.028 12:05:22 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:33.028 12:05:22 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:33.028 12:05:22 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:33.028 12:05:22 compress_isal -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:33.028 12:05:22 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:33.028 12:05:22 compress_isal -- paths/export.sh@5 -- # export PATH 00:24:33.028 12:05:22 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@47 -- # : 0 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i 
"$NVMF_APP_SHM_ID" -e 0xFFFF) 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:24:33.028 12:05:22 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:24:33.028 12:05:22 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:33.028 12:05:22 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:24:33.028 12:05:22 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:24:33.028 12:05:22 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:24:33.028 12:05:22 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:33.028 12:05:22 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=761224 00:24:33.028 12:05:22 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:33.028 12:05:22 compress_isal -- compress/compress.sh@73 -- # waitforlisten 761224 00:24:33.028 12:05:22 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 761224 ']' 00:24:33.028 12:05:22 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:33.028 12:05:22 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:33.028 12:05:22 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:33.028 12:05:22 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:33.028 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:24:33.028 12:05:22 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:33.028 12:05:22 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:33.028 [2024-07-12 12:05:23.007761] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:24:33.028 [2024-07-12 12:05:23.007808] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid761224 ] 00:24:33.028 [2024-07-12 12:05:23.072955] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:33.028 [2024-07-12 12:05:23.145547] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:33.028 [2024-07-12 12:05:23.145565] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:33.593 12:05:23 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:33.593 12:05:23 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:24:33.593 12:05:23 compress_isal -- compress/compress.sh@74 -- # create_vols 00:24:33.593 12:05:23 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:33.593 12:05:23 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:36.872 12:05:26 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:36.872 12:05:26 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:36.872 12:05:26 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:36.872 12:05:26 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:36.872 12:05:26 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:36.872 12:05:26 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:36.872 12:05:26 compress_isal -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:36.872 12:05:26 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:36.872 [ 00:24:36.872 { 00:24:36.872 "name": "Nvme0n1", 00:24:36.872 "aliases": [ 00:24:36.872 "63ede6f8-fb22-41f9-92c5-1d29cb92ba7c" 00:24:36.872 ], 00:24:36.872 "product_name": "NVMe disk", 00:24:36.872 "block_size": 512, 00:24:36.872 "num_blocks": 1953525168, 00:24:36.872 "uuid": "63ede6f8-fb22-41f9-92c5-1d29cb92ba7c", 00:24:36.872 "assigned_rate_limits": { 00:24:36.872 "rw_ios_per_sec": 0, 00:24:36.872 "rw_mbytes_per_sec": 0, 00:24:36.872 "r_mbytes_per_sec": 0, 00:24:36.872 "w_mbytes_per_sec": 0 00:24:36.872 }, 00:24:36.872 "claimed": false, 00:24:36.872 "zoned": false, 00:24:36.872 "supported_io_types": { 00:24:36.872 "read": true, 00:24:36.872 "write": true, 00:24:36.872 "unmap": true, 00:24:36.872 "flush": true, 00:24:36.872 "reset": true, 00:24:36.872 "nvme_admin": true, 00:24:36.872 "nvme_io": true, 00:24:36.872 "nvme_io_md": false, 00:24:36.872 "write_zeroes": true, 00:24:36.872 "zcopy": false, 00:24:36.872 "get_zone_info": false, 00:24:36.872 "zone_management": false, 00:24:36.872 "zone_append": false, 00:24:36.872 "compare": false, 00:24:36.872 "compare_and_write": false, 00:24:36.872 "abort": true, 00:24:36.872 "seek_hole": false, 00:24:36.872 "seek_data": false, 00:24:36.872 "copy": false, 00:24:36.872 "nvme_iov_md": false 00:24:36.872 }, 00:24:36.872 "driver_specific": { 00:24:36.872 "nvme": [ 00:24:36.872 { 00:24:36.872 "pci_address": "0000:5e:00.0", 00:24:36.872 "trid": { 00:24:36.872 "trtype": "PCIe", 00:24:36.872 "traddr": "0000:5e:00.0" 00:24:36.872 }, 00:24:36.872 "ctrlr_data": { 00:24:36.872 "cntlid": 0, 00:24:36.872 "vendor_id": "0x8086", 00:24:36.872 "model_number": "INTEL SSDPE2KX010T8", 00:24:36.872 "serial_number": "BTLJ807001JM1P0FGN", 
00:24:36.872 "firmware_revision": "VDV10170", 00:24:36.872 "oacs": { 00:24:36.872 "security": 1, 00:24:36.872 "format": 1, 00:24:36.872 "firmware": 1, 00:24:36.872 "ns_manage": 1 00:24:36.872 }, 00:24:36.872 "multi_ctrlr": false, 00:24:36.872 "ana_reporting": false 00:24:36.872 }, 00:24:36.872 "vs": { 00:24:36.872 "nvme_version": "1.2" 00:24:36.872 }, 00:24:36.873 "ns_data": { 00:24:36.873 "id": 1, 00:24:36.873 "can_share": false 00:24:36.873 }, 00:24:36.873 "security": { 00:24:36.873 "opal": true 00:24:36.873 } 00:24:36.873 } 00:24:36.873 ], 00:24:36.873 "mp_policy": "active_passive" 00:24:36.873 } 00:24:36.873 } 00:24:36.873 ] 00:24:37.130 12:05:27 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:37.130 12:05:27 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:38.063 e07b56f7-4c5a-4744-9108-26935d79f6dc 00:24:38.063 12:05:28 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:38.063 15a96051-a6d9-4c93-9bea-1874b54ff848 00:24:38.063 12:05:28 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:38.063 12:05:28 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:38.063 12:05:28 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:38.063 12:05:28 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:38.063 12:05:28 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:38.063 12:05:28 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:38.063 12:05:28 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:38.321 12:05:28 compress_isal -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:38.579 [ 00:24:38.579 { 00:24:38.579 "name": "15a96051-a6d9-4c93-9bea-1874b54ff848", 00:24:38.579 "aliases": [ 00:24:38.579 "lvs0/lv0" 00:24:38.579 ], 00:24:38.579 "product_name": "Logical Volume", 00:24:38.579 "block_size": 512, 00:24:38.579 "num_blocks": 204800, 00:24:38.579 "uuid": "15a96051-a6d9-4c93-9bea-1874b54ff848", 00:24:38.579 "assigned_rate_limits": { 00:24:38.579 "rw_ios_per_sec": 0, 00:24:38.579 "rw_mbytes_per_sec": 0, 00:24:38.579 "r_mbytes_per_sec": 0, 00:24:38.579 "w_mbytes_per_sec": 0 00:24:38.579 }, 00:24:38.579 "claimed": false, 00:24:38.579 "zoned": false, 00:24:38.579 "supported_io_types": { 00:24:38.579 "read": true, 00:24:38.579 "write": true, 00:24:38.579 "unmap": true, 00:24:38.579 "flush": false, 00:24:38.579 "reset": true, 00:24:38.579 "nvme_admin": false, 00:24:38.579 "nvme_io": false, 00:24:38.579 "nvme_io_md": false, 00:24:38.579 "write_zeroes": true, 00:24:38.579 "zcopy": false, 00:24:38.579 "get_zone_info": false, 00:24:38.579 "zone_management": false, 00:24:38.579 "zone_append": false, 00:24:38.579 "compare": false, 00:24:38.579 "compare_and_write": false, 00:24:38.579 "abort": false, 00:24:38.579 "seek_hole": true, 00:24:38.579 "seek_data": true, 00:24:38.579 "copy": false, 00:24:38.579 "nvme_iov_md": false 00:24:38.579 }, 00:24:38.579 "driver_specific": { 00:24:38.579 "lvol": { 00:24:38.579 "lvol_store_uuid": "e07b56f7-4c5a-4744-9108-26935d79f6dc", 00:24:38.579 "base_bdev": "Nvme0n1", 00:24:38.579 "thin_provision": true, 00:24:38.579 "num_allocated_clusters": 0, 00:24:38.579 "snapshot": false, 00:24:38.579 "clone": false, 00:24:38.579 "esnap_clone": false 00:24:38.579 } 00:24:38.579 } 00:24:38.579 } 00:24:38.579 ] 00:24:38.579 12:05:28 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:38.579 12:05:28 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:38.579 12:05:28 compress_isal -- 
compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:38.579 [2024-07-12 12:05:28.789579] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:38.579 COMP_lvs0/lv0 00:24:38.579 12:05:28 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:38.579 12:05:28 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:38.579 12:05:28 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:38.579 12:05:28 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:38.579 12:05:28 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:38.579 12:05:28 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:38.579 12:05:28 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:38.837 12:05:28 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:39.095 [ 00:24:39.095 { 00:24:39.095 "name": "COMP_lvs0/lv0", 00:24:39.095 "aliases": [ 00:24:39.095 "cbfc200d-169b-5529-a8e9-725401e03799" 00:24:39.095 ], 00:24:39.095 "product_name": "compress", 00:24:39.095 "block_size": 512, 00:24:39.095 "num_blocks": 200704, 00:24:39.095 "uuid": "cbfc200d-169b-5529-a8e9-725401e03799", 00:24:39.095 "assigned_rate_limits": { 00:24:39.095 "rw_ios_per_sec": 0, 00:24:39.095 "rw_mbytes_per_sec": 0, 00:24:39.095 "r_mbytes_per_sec": 0, 00:24:39.095 "w_mbytes_per_sec": 0 00:24:39.095 }, 00:24:39.095 "claimed": false, 00:24:39.095 "zoned": false, 00:24:39.095 "supported_io_types": { 00:24:39.095 "read": true, 00:24:39.095 "write": true, 00:24:39.095 "unmap": false, 00:24:39.095 "flush": false, 00:24:39.095 "reset": false, 00:24:39.095 "nvme_admin": 
false, 00:24:39.095 "nvme_io": false, 00:24:39.095 "nvme_io_md": false, 00:24:39.095 "write_zeroes": true, 00:24:39.095 "zcopy": false, 00:24:39.095 "get_zone_info": false, 00:24:39.095 "zone_management": false, 00:24:39.095 "zone_append": false, 00:24:39.095 "compare": false, 00:24:39.095 "compare_and_write": false, 00:24:39.095 "abort": false, 00:24:39.095 "seek_hole": false, 00:24:39.095 "seek_data": false, 00:24:39.095 "copy": false, 00:24:39.095 "nvme_iov_md": false 00:24:39.095 }, 00:24:39.095 "driver_specific": { 00:24:39.095 "compress": { 00:24:39.095 "name": "COMP_lvs0/lv0", 00:24:39.095 "base_bdev_name": "15a96051-a6d9-4c93-9bea-1874b54ff848" 00:24:39.095 } 00:24:39.095 } 00:24:39.095 } 00:24:39.095 ] 00:24:39.095 12:05:29 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:39.095 12:05:29 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:39.095 Running I/O for 3 seconds... 
00:24:42.375 00:24:42.375 Latency(us) 00:24:42.375 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:42.375 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:42.375 Verification LBA range: start 0x0 length 0x3100 00:24:42.375 COMP_lvs0/lv0 : 3.01 3377.14 13.19 0.00 0.00 9437.66 57.30 14417.92 00:24:42.375 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:42.375 Verification LBA range: start 0x3100 length 0x3100 00:24:42.375 COMP_lvs0/lv0 : 3.01 3412.64 13.33 0.00 0.00 9336.26 54.37 14480.34 00:24:42.375 =================================================================================================================== 00:24:42.375 Total : 6789.78 26.52 0.00 0.00 9386.68 54.37 14480.34 00:24:42.375 0 00:24:42.375 12:05:32 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:24:42.375 12:05:32 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:42.375 12:05:32 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:42.375 12:05:32 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:42.375 12:05:32 compress_isal -- compress/compress.sh@78 -- # killprocess 761224 00:24:42.375 12:05:32 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 761224 ']' 00:24:42.375 12:05:32 compress_isal -- common/autotest_common.sh@952 -- # kill -0 761224 00:24:42.375 12:05:32 compress_isal -- common/autotest_common.sh@953 -- # uname 00:24:42.375 12:05:32 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:42.375 12:05:32 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 761224 00:24:42.633 12:05:32 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:42.633 12:05:32 compress_isal -- 
common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:42.633 12:05:32 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 761224' 00:24:42.633 killing process with pid 761224 00:24:42.633 12:05:32 compress_isal -- common/autotest_common.sh@967 -- # kill 761224 00:24:42.633 Received shutdown signal, test time was about 3.000000 seconds 00:24:42.633 00:24:42.633 Latency(us) 00:24:42.633 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:42.633 =================================================================================================================== 00:24:42.633 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:42.633 12:05:32 compress_isal -- common/autotest_common.sh@972 -- # wait 761224 00:24:44.005 12:05:34 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:24:44.005 12:05:34 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:44.005 12:05:34 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=763059 00:24:44.005 12:05:34 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:44.005 12:05:34 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:44.005 12:05:34 compress_isal -- compress/compress.sh@73 -- # waitforlisten 763059 00:24:44.005 12:05:34 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 763059 ']' 00:24:44.005 12:05:34 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:44.005 12:05:34 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:44.005 12:05:34 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:24:44.005 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:44.005 12:05:34 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:44.005 12:05:34 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:44.005 [2024-07-12 12:05:34.168292] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:24:44.005 [2024-07-12 12:05:34.168337] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid763059 ] 00:24:44.005 [2024-07-12 12:05:34.231999] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:44.263 [2024-07-12 12:05:34.302762] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:44.263 [2024-07-12 12:05:34.302765] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:44.828 12:05:34 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:44.828 12:05:34 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:24:44.828 12:05:34 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:24:44.828 12:05:34 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:44.828 12:05:34 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:48.108 12:05:37 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:48.108 12:05:37 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:48.108 12:05:37 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:48.108 12:05:37 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:48.108 12:05:37 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:48.108 12:05:37 
compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:48.108 12:05:37 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:48.108 12:05:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:48.108 [ 00:24:48.108 { 00:24:48.108 "name": "Nvme0n1", 00:24:48.108 "aliases": [ 00:24:48.108 "968c4627-6451-4448-8c3a-f8a181a2cc87" 00:24:48.108 ], 00:24:48.108 "product_name": "NVMe disk", 00:24:48.108 "block_size": 512, 00:24:48.108 "num_blocks": 1953525168, 00:24:48.108 "uuid": "968c4627-6451-4448-8c3a-f8a181a2cc87", 00:24:48.108 "assigned_rate_limits": { 00:24:48.108 "rw_ios_per_sec": 0, 00:24:48.108 "rw_mbytes_per_sec": 0, 00:24:48.108 "r_mbytes_per_sec": 0, 00:24:48.108 "w_mbytes_per_sec": 0 00:24:48.108 }, 00:24:48.108 "claimed": false, 00:24:48.108 "zoned": false, 00:24:48.108 "supported_io_types": { 00:24:48.108 "read": true, 00:24:48.108 "write": true, 00:24:48.108 "unmap": true, 00:24:48.108 "flush": true, 00:24:48.108 "reset": true, 00:24:48.108 "nvme_admin": true, 00:24:48.108 "nvme_io": true, 00:24:48.108 "nvme_io_md": false, 00:24:48.108 "write_zeroes": true, 00:24:48.108 "zcopy": false, 00:24:48.108 "get_zone_info": false, 00:24:48.108 "zone_management": false, 00:24:48.108 "zone_append": false, 00:24:48.108 "compare": false, 00:24:48.108 "compare_and_write": false, 00:24:48.108 "abort": true, 00:24:48.108 "seek_hole": false, 00:24:48.108 "seek_data": false, 00:24:48.108 "copy": false, 00:24:48.108 "nvme_iov_md": false 00:24:48.108 }, 00:24:48.108 "driver_specific": { 00:24:48.108 "nvme": [ 00:24:48.108 { 00:24:48.108 "pci_address": "0000:5e:00.0", 00:24:48.108 "trid": { 00:24:48.108 "trtype": "PCIe", 00:24:48.108 "traddr": "0000:5e:00.0" 00:24:48.108 }, 00:24:48.108 "ctrlr_data": { 00:24:48.108 "cntlid": 0, 00:24:48.108 "vendor_id": "0x8086", 
00:24:48.108 "model_number": "INTEL SSDPE2KX010T8", 00:24:48.108 "serial_number": "BTLJ807001JM1P0FGN", 00:24:48.108 "firmware_revision": "VDV10170", 00:24:48.108 "oacs": { 00:24:48.108 "security": 1, 00:24:48.108 "format": 1, 00:24:48.108 "firmware": 1, 00:24:48.108 "ns_manage": 1 00:24:48.108 }, 00:24:48.108 "multi_ctrlr": false, 00:24:48.108 "ana_reporting": false 00:24:48.108 }, 00:24:48.108 "vs": { 00:24:48.108 "nvme_version": "1.2" 00:24:48.108 }, 00:24:48.108 "ns_data": { 00:24:48.108 "id": 1, 00:24:48.108 "can_share": false 00:24:48.108 }, 00:24:48.108 "security": { 00:24:48.108 "opal": true 00:24:48.108 } 00:24:48.108 } 00:24:48.108 ], 00:24:48.108 "mp_policy": "active_passive" 00:24:48.108 } 00:24:48.108 } 00:24:48.108 ] 00:24:48.108 12:05:38 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:48.108 12:05:38 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:49.481 e1d1ad95-0f2a-47c8-aee1-f516a8b0c0f0 00:24:49.482 12:05:39 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:49.482 2c5bef32-3e30-4b78-bbcf-295d2a3c6c8d 00:24:49.482 12:05:39 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:49.482 12:05:39 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:24:49.482 12:05:39 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:49.482 12:05:39 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:49.482 12:05:39 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:49.482 12:05:39 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:49.482 12:05:39 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:49.482 
12:05:39 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:49.740 [ 00:24:49.740 { 00:24:49.740 "name": "2c5bef32-3e30-4b78-bbcf-295d2a3c6c8d", 00:24:49.740 "aliases": [ 00:24:49.740 "lvs0/lv0" 00:24:49.740 ], 00:24:49.740 "product_name": "Logical Volume", 00:24:49.740 "block_size": 512, 00:24:49.740 "num_blocks": 204800, 00:24:49.740 "uuid": "2c5bef32-3e30-4b78-bbcf-295d2a3c6c8d", 00:24:49.740 "assigned_rate_limits": { 00:24:49.740 "rw_ios_per_sec": 0, 00:24:49.740 "rw_mbytes_per_sec": 0, 00:24:49.740 "r_mbytes_per_sec": 0, 00:24:49.740 "w_mbytes_per_sec": 0 00:24:49.740 }, 00:24:49.740 "claimed": false, 00:24:49.740 "zoned": false, 00:24:49.740 "supported_io_types": { 00:24:49.740 "read": true, 00:24:49.740 "write": true, 00:24:49.740 "unmap": true, 00:24:49.740 "flush": false, 00:24:49.740 "reset": true, 00:24:49.740 "nvme_admin": false, 00:24:49.740 "nvme_io": false, 00:24:49.740 "nvme_io_md": false, 00:24:49.740 "write_zeroes": true, 00:24:49.740 "zcopy": false, 00:24:49.740 "get_zone_info": false, 00:24:49.740 "zone_management": false, 00:24:49.740 "zone_append": false, 00:24:49.740 "compare": false, 00:24:49.740 "compare_and_write": false, 00:24:49.740 "abort": false, 00:24:49.740 "seek_hole": true, 00:24:49.740 "seek_data": true, 00:24:49.740 "copy": false, 00:24:49.740 "nvme_iov_md": false 00:24:49.740 }, 00:24:49.740 "driver_specific": { 00:24:49.740 "lvol": { 00:24:49.740 "lvol_store_uuid": "e1d1ad95-0f2a-47c8-aee1-f516a8b0c0f0", 00:24:49.740 "base_bdev": "Nvme0n1", 00:24:49.740 "thin_provision": true, 00:24:49.740 "num_allocated_clusters": 0, 00:24:49.740 "snapshot": false, 00:24:49.740 "clone": false, 00:24:49.740 "esnap_clone": false 00:24:49.740 } 00:24:49.740 } 00:24:49.740 } 00:24:49.740 ] 00:24:49.740 12:05:39 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:49.740 12:05:39 compress_isal -- compress/compress.sh@41 -- # '[' -z 
512 ']' 00:24:49.740 12:05:39 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:24:49.740 [2024-07-12 12:05:39.955041] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:49.740 COMP_lvs0/lv0 00:24:49.740 12:05:39 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:49.740 12:05:39 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:24:49.740 12:05:39 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:49.740 12:05:39 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:49.740 12:05:39 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:49.740 12:05:39 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:49.740 12:05:39 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:49.997 12:05:40 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:50.255 [ 00:24:50.255 { 00:24:50.255 "name": "COMP_lvs0/lv0", 00:24:50.255 "aliases": [ 00:24:50.255 "d1cb3745-a5df-5fa2-9025-faf15ccd0410" 00:24:50.255 ], 00:24:50.255 "product_name": "compress", 00:24:50.255 "block_size": 512, 00:24:50.255 "num_blocks": 200704, 00:24:50.255 "uuid": "d1cb3745-a5df-5fa2-9025-faf15ccd0410", 00:24:50.255 "assigned_rate_limits": { 00:24:50.255 "rw_ios_per_sec": 0, 00:24:50.255 "rw_mbytes_per_sec": 0, 00:24:50.255 "r_mbytes_per_sec": 0, 00:24:50.255 "w_mbytes_per_sec": 0 00:24:50.255 }, 00:24:50.255 "claimed": false, 00:24:50.255 "zoned": false, 00:24:50.255 "supported_io_types": { 00:24:50.255 "read": true, 00:24:50.255 "write": true, 00:24:50.255 "unmap": false, 00:24:50.255 "flush": false, 
00:24:50.255 "reset": false, 00:24:50.255 "nvme_admin": false, 00:24:50.255 "nvme_io": false, 00:24:50.255 "nvme_io_md": false, 00:24:50.255 "write_zeroes": true, 00:24:50.255 "zcopy": false, 00:24:50.255 "get_zone_info": false, 00:24:50.255 "zone_management": false, 00:24:50.255 "zone_append": false, 00:24:50.255 "compare": false, 00:24:50.255 "compare_and_write": false, 00:24:50.255 "abort": false, 00:24:50.255 "seek_hole": false, 00:24:50.255 "seek_data": false, 00:24:50.255 "copy": false, 00:24:50.255 "nvme_iov_md": false 00:24:50.255 }, 00:24:50.255 "driver_specific": { 00:24:50.255 "compress": { 00:24:50.255 "name": "COMP_lvs0/lv0", 00:24:50.255 "base_bdev_name": "2c5bef32-3e30-4b78-bbcf-295d2a3c6c8d" 00:24:50.255 } 00:24:50.255 } 00:24:50.255 } 00:24:50.255 ] 00:24:50.255 12:05:40 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:50.255 12:05:40 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:50.255 Running I/O for 3 seconds... 
00:24:53.535 00:24:53.535 Latency(us) 00:24:53.535 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:53.535 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:53.535 Verification LBA range: start 0x0 length 0x3100 00:24:53.535 COMP_lvs0/lv0 : 3.01 3404.28 13.30 0.00 0.00 9348.34 56.81 14417.92 00:24:53.535 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:53.535 Verification LBA range: start 0x3100 length 0x3100 00:24:53.535 COMP_lvs0/lv0 : 3.01 3417.52 13.35 0.00 0.00 9321.07 54.37 14230.67 00:24:53.535 =================================================================================================================== 00:24:53.535 Total : 6821.79 26.65 0.00 0.00 9334.68 54.37 14417.92 00:24:53.535 0 00:24:53.535 12:05:43 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:24:53.535 12:05:43 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:53.535 12:05:43 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:53.535 12:05:43 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:53.792 12:05:43 compress_isal -- compress/compress.sh@78 -- # killprocess 763059 00:24:53.792 12:05:43 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 763059 ']' 00:24:53.792 12:05:43 compress_isal -- common/autotest_common.sh@952 -- # kill -0 763059 00:24:53.792 12:05:43 compress_isal -- common/autotest_common.sh@953 -- # uname 00:24:53.792 12:05:43 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:53.792 12:05:43 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 763059 00:24:53.792 12:05:43 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:24:53.792 12:05:43 compress_isal -- 
common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:24:53.792 12:05:43 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 763059' 00:24:53.792 killing process with pid 763059 00:24:53.792 12:05:43 compress_isal -- common/autotest_common.sh@967 -- # kill 763059 00:24:53.792 Received shutdown signal, test time was about 3.000000 seconds 00:24:53.792 00:24:53.792 Latency(us) 00:24:53.792 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:53.792 =================================================================================================================== 00:24:53.792 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:53.792 12:05:43 compress_isal -- common/autotest_common.sh@972 -- # wait 763059 00:24:55.164 12:05:45 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:24:55.164 12:05:45 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:55.164 12:05:45 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=764897 00:24:55.164 12:05:45 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:55.164 12:05:45 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:55.164 12:05:45 compress_isal -- compress/compress.sh@73 -- # waitforlisten 764897 00:24:55.164 12:05:45 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 764897 ']' 00:24:55.164 12:05:45 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:55.164 12:05:45 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:55.164 12:05:45 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:24:55.164 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:55.164 12:05:45 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:55.164 12:05:45 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:55.164 [2024-07-12 12:05:45.315340] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:24:55.164 [2024-07-12 12:05:45.315385] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid764897 ] 00:24:55.164 [2024-07-12 12:05:45.377551] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:55.422 [2024-07-12 12:05:45.455281] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:55.422 [2024-07-12 12:05:45.455281] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:55.989 12:05:46 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:55.989 12:05:46 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:24:55.989 12:05:46 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:24:55.989 12:05:46 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:55.989 12:05:46 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:59.337 12:05:49 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:59.337 12:05:49 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:24:59.337 12:05:49 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:59.337 12:05:49 compress_isal -- common/autotest_common.sh@899 -- # local i 00:24:59.337 12:05:49 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:59.337 12:05:49 
compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:59.337 12:05:49 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:59.337 12:05:49 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:59.337 [ 00:24:59.337 { 00:24:59.337 "name": "Nvme0n1", 00:24:59.337 "aliases": [ 00:24:59.337 "8a131049-bc72-4e2a-8517-c821e9550b5a" 00:24:59.337 ], 00:24:59.337 "product_name": "NVMe disk", 00:24:59.337 "block_size": 512, 00:24:59.337 "num_blocks": 1953525168, 00:24:59.337 "uuid": "8a131049-bc72-4e2a-8517-c821e9550b5a", 00:24:59.337 "assigned_rate_limits": { 00:24:59.337 "rw_ios_per_sec": 0, 00:24:59.337 "rw_mbytes_per_sec": 0, 00:24:59.337 "r_mbytes_per_sec": 0, 00:24:59.337 "w_mbytes_per_sec": 0 00:24:59.337 }, 00:24:59.337 "claimed": false, 00:24:59.337 "zoned": false, 00:24:59.337 "supported_io_types": { 00:24:59.337 "read": true, 00:24:59.337 "write": true, 00:24:59.337 "unmap": true, 00:24:59.337 "flush": true, 00:24:59.337 "reset": true, 00:24:59.337 "nvme_admin": true, 00:24:59.337 "nvme_io": true, 00:24:59.337 "nvme_io_md": false, 00:24:59.337 "write_zeroes": true, 00:24:59.337 "zcopy": false, 00:24:59.337 "get_zone_info": false, 00:24:59.337 "zone_management": false, 00:24:59.337 "zone_append": false, 00:24:59.337 "compare": false, 00:24:59.337 "compare_and_write": false, 00:24:59.337 "abort": true, 00:24:59.337 "seek_hole": false, 00:24:59.337 "seek_data": false, 00:24:59.337 "copy": false, 00:24:59.337 "nvme_iov_md": false 00:24:59.337 }, 00:24:59.337 "driver_specific": { 00:24:59.337 "nvme": [ 00:24:59.337 { 00:24:59.337 "pci_address": "0000:5e:00.0", 00:24:59.337 "trid": { 00:24:59.337 "trtype": "PCIe", 00:24:59.337 "traddr": "0000:5e:00.0" 00:24:59.337 }, 00:24:59.337 "ctrlr_data": { 00:24:59.337 "cntlid": 0, 00:24:59.337 "vendor_id": "0x8086", 
00:24:59.337 "model_number": "INTEL SSDPE2KX010T8", 00:24:59.337 "serial_number": "BTLJ807001JM1P0FGN", 00:24:59.337 "firmware_revision": "VDV10170", 00:24:59.337 "oacs": { 00:24:59.337 "security": 1, 00:24:59.337 "format": 1, 00:24:59.337 "firmware": 1, 00:24:59.337 "ns_manage": 1 00:24:59.337 }, 00:24:59.337 "multi_ctrlr": false, 00:24:59.337 "ana_reporting": false 00:24:59.337 }, 00:24:59.337 "vs": { 00:24:59.337 "nvme_version": "1.2" 00:24:59.337 }, 00:24:59.337 "ns_data": { 00:24:59.337 "id": 1, 00:24:59.337 "can_share": false 00:24:59.337 }, 00:24:59.337 "security": { 00:24:59.337 "opal": true 00:24:59.337 } 00:24:59.337 } 00:24:59.337 ], 00:24:59.337 "mp_policy": "active_passive" 00:24:59.337 } 00:24:59.337 } 00:24:59.337 ] 00:24:59.337 12:05:49 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:24:59.337 12:05:49 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:00.271 3f3caca5-da69-40be-be29-22ba76390990 00:25:00.271 12:05:50 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:00.528 15c87ab1-9451-4b46-b661-435dca8775c5 00:25:00.528 12:05:50 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:00.528 12:05:50 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:25:00.528 12:05:50 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:00.528 12:05:50 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:00.528 12:05:50 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:00.528 12:05:50 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:00.528 12:05:50 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:00.786 
12:05:50 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:00.786 [ 00:25:00.786 { 00:25:00.786 "name": "15c87ab1-9451-4b46-b661-435dca8775c5", 00:25:00.786 "aliases": [ 00:25:00.786 "lvs0/lv0" 00:25:00.786 ], 00:25:00.786 "product_name": "Logical Volume", 00:25:00.786 "block_size": 512, 00:25:00.786 "num_blocks": 204800, 00:25:00.786 "uuid": "15c87ab1-9451-4b46-b661-435dca8775c5", 00:25:00.786 "assigned_rate_limits": { 00:25:00.786 "rw_ios_per_sec": 0, 00:25:00.786 "rw_mbytes_per_sec": 0, 00:25:00.786 "r_mbytes_per_sec": 0, 00:25:00.786 "w_mbytes_per_sec": 0 00:25:00.786 }, 00:25:00.786 "claimed": false, 00:25:00.786 "zoned": false, 00:25:00.786 "supported_io_types": { 00:25:00.786 "read": true, 00:25:00.786 "write": true, 00:25:00.786 "unmap": true, 00:25:00.786 "flush": false, 00:25:00.786 "reset": true, 00:25:00.786 "nvme_admin": false, 00:25:00.786 "nvme_io": false, 00:25:00.786 "nvme_io_md": false, 00:25:00.786 "write_zeroes": true, 00:25:00.786 "zcopy": false, 00:25:00.786 "get_zone_info": false, 00:25:00.786 "zone_management": false, 00:25:00.786 "zone_append": false, 00:25:00.786 "compare": false, 00:25:00.786 "compare_and_write": false, 00:25:00.786 "abort": false, 00:25:00.786 "seek_hole": true, 00:25:00.786 "seek_data": true, 00:25:00.786 "copy": false, 00:25:00.786 "nvme_iov_md": false 00:25:00.786 }, 00:25:00.786 "driver_specific": { 00:25:00.786 "lvol": { 00:25:00.786 "lvol_store_uuid": "3f3caca5-da69-40be-be29-22ba76390990", 00:25:00.786 "base_bdev": "Nvme0n1", 00:25:00.786 "thin_provision": true, 00:25:00.786 "num_allocated_clusters": 0, 00:25:00.786 "snapshot": false, 00:25:00.786 "clone": false, 00:25:00.786 "esnap_clone": false 00:25:00.786 } 00:25:00.786 } 00:25:00.786 } 00:25:00.786 ] 00:25:00.786 12:05:50 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:00.786 12:05:50 compress_isal -- compress/compress.sh@41 -- # '[' -z 
4096 ']' 00:25:00.786 12:05:50 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:25:01.044 [2024-07-12 12:05:51.130910] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:01.044 COMP_lvs0/lv0 00:25:01.044 12:05:51 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:01.044 12:05:51 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:25:01.044 12:05:51 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:01.044 12:05:51 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:01.044 12:05:51 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:01.044 12:05:51 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:01.044 12:05:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:01.301 12:05:51 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:01.301 [ 00:25:01.301 { 00:25:01.301 "name": "COMP_lvs0/lv0", 00:25:01.301 "aliases": [ 00:25:01.301 "66195728-619e-54d4-be03-2105acdab9f7" 00:25:01.301 ], 00:25:01.301 "product_name": "compress", 00:25:01.301 "block_size": 4096, 00:25:01.301 "num_blocks": 25088, 00:25:01.301 "uuid": "66195728-619e-54d4-be03-2105acdab9f7", 00:25:01.301 "assigned_rate_limits": { 00:25:01.301 "rw_ios_per_sec": 0, 00:25:01.301 "rw_mbytes_per_sec": 0, 00:25:01.301 "r_mbytes_per_sec": 0, 00:25:01.301 "w_mbytes_per_sec": 0 00:25:01.301 }, 00:25:01.301 "claimed": false, 00:25:01.301 "zoned": false, 00:25:01.301 "supported_io_types": { 00:25:01.302 "read": true, 00:25:01.302 "write": true, 00:25:01.302 "unmap": false, 00:25:01.302 "flush": false, 
00:25:01.302 "reset": false, 00:25:01.302 "nvme_admin": false, 00:25:01.302 "nvme_io": false, 00:25:01.302 "nvme_io_md": false, 00:25:01.302 "write_zeroes": true, 00:25:01.302 "zcopy": false, 00:25:01.302 "get_zone_info": false, 00:25:01.302 "zone_management": false, 00:25:01.302 "zone_append": false, 00:25:01.302 "compare": false, 00:25:01.302 "compare_and_write": false, 00:25:01.302 "abort": false, 00:25:01.302 "seek_hole": false, 00:25:01.302 "seek_data": false, 00:25:01.302 "copy": false, 00:25:01.302 "nvme_iov_md": false 00:25:01.302 }, 00:25:01.302 "driver_specific": { 00:25:01.302 "compress": { 00:25:01.302 "name": "COMP_lvs0/lv0", 00:25:01.302 "base_bdev_name": "15c87ab1-9451-4b46-b661-435dca8775c5" 00:25:01.302 } 00:25:01.302 } 00:25:01.302 } 00:25:01.302 ] 00:25:01.302 12:05:51 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:01.302 12:05:51 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:25:01.560 Running I/O for 3 seconds... 
00:25:04.847 00:25:04.847 Latency(us) 00:25:04.847 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:04.847 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:25:04.847 Verification LBA range: start 0x0 length 0x3100 00:25:04.847 COMP_lvs0/lv0 : 3.01 3377.23 13.19 0.00 0.00 9437.84 57.05 15166.90 00:25:04.847 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:25:04.847 Verification LBA range: start 0x3100 length 0x3100 00:25:04.847 COMP_lvs0/lv0 : 3.01 3377.82 13.19 0.00 0.00 9430.02 56.32 15166.90 00:25:04.847 =================================================================================================================== 00:25:04.847 Total : 6755.05 26.39 0.00 0.00 9433.93 56.32 15166.90 00:25:04.847 0 00:25:04.847 12:05:54 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:25:04.847 12:05:54 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:04.847 12:05:54 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:04.847 12:05:54 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:25:04.847 12:05:54 compress_isal -- compress/compress.sh@78 -- # killprocess 764897 00:25:04.847 12:05:54 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 764897 ']' 00:25:04.847 12:05:54 compress_isal -- common/autotest_common.sh@952 -- # kill -0 764897 00:25:04.847 12:05:54 compress_isal -- common/autotest_common.sh@953 -- # uname 00:25:04.847 12:05:54 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:04.847 12:05:54 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 764897 00:25:04.847 12:05:54 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:25:04.847 12:05:54 compress_isal -- 
common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:25:04.847 12:05:54 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 764897' 00:25:04.847 killing process with pid 764897 00:25:04.847 12:05:54 compress_isal -- common/autotest_common.sh@967 -- # kill 764897 00:25:04.847 Received shutdown signal, test time was about 3.000000 seconds 00:25:04.847 00:25:04.847 Latency(us) 00:25:04.847 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:04.847 =================================================================================================================== 00:25:04.847 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:04.847 12:05:54 compress_isal -- common/autotest_common.sh@972 -- # wait 764897 00:25:06.220 12:05:56 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:25:06.220 12:05:56 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:25:06.220 12:05:56 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=766741 00:25:06.220 12:05:56 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:06.220 12:05:56 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:25:06.220 12:05:56 compress_isal -- compress/compress.sh@57 -- # waitforlisten 766741 00:25:06.220 12:05:56 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 766741 ']' 00:25:06.220 12:05:56 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:06.220 12:05:56 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:06.220 12:05:56 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:06.220 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:25:06.220 12:05:56 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:06.220 12:05:56 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:25:06.478 [2024-07-12 12:05:56.499904] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:25:06.478 [2024-07-12 12:05:56.499946] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid766741 ] 00:25:06.478 [2024-07-12 12:05:56.563742] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:06.478 [2024-07-12 12:05:56.633568] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:25:06.478 [2024-07-12 12:05:56.633664] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:06.478 [2024-07-12 12:05:56.633664] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:25:07.069 12:05:57 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:07.069 12:05:57 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:25:07.069 12:05:57 compress_isal -- compress/compress.sh@58 -- # create_vols 00:25:07.069 12:05:57 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:07.069 12:05:57 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:10.379 12:06:00 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:10.379 12:06:00 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:25:10.379 12:06:00 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:10.379 12:06:00 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:10.379 12:06:00 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:10.379 12:06:00 
compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:10.379 12:06:00 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:10.380 12:06:00 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:10.638 [ 00:25:10.638 { 00:25:10.638 "name": "Nvme0n1", 00:25:10.638 "aliases": [ 00:25:10.638 "e7ef5743-6cd9-4e86-a815-1b17c653d788" 00:25:10.638 ], 00:25:10.638 "product_name": "NVMe disk", 00:25:10.638 "block_size": 512, 00:25:10.639 "num_blocks": 1953525168, 00:25:10.639 "uuid": "e7ef5743-6cd9-4e86-a815-1b17c653d788", 00:25:10.639 "assigned_rate_limits": { 00:25:10.639 "rw_ios_per_sec": 0, 00:25:10.639 "rw_mbytes_per_sec": 0, 00:25:10.639 "r_mbytes_per_sec": 0, 00:25:10.639 "w_mbytes_per_sec": 0 00:25:10.639 }, 00:25:10.639 "claimed": false, 00:25:10.639 "zoned": false, 00:25:10.639 "supported_io_types": { 00:25:10.639 "read": true, 00:25:10.639 "write": true, 00:25:10.639 "unmap": true, 00:25:10.639 "flush": true, 00:25:10.639 "reset": true, 00:25:10.639 "nvme_admin": true, 00:25:10.639 "nvme_io": true, 00:25:10.639 "nvme_io_md": false, 00:25:10.639 "write_zeroes": true, 00:25:10.639 "zcopy": false, 00:25:10.639 "get_zone_info": false, 00:25:10.639 "zone_management": false, 00:25:10.639 "zone_append": false, 00:25:10.639 "compare": false, 00:25:10.639 "compare_and_write": false, 00:25:10.639 "abort": true, 00:25:10.639 "seek_hole": false, 00:25:10.639 "seek_data": false, 00:25:10.639 "copy": false, 00:25:10.639 "nvme_iov_md": false 00:25:10.639 }, 00:25:10.639 "driver_specific": { 00:25:10.639 "nvme": [ 00:25:10.639 { 00:25:10.639 "pci_address": "0000:5e:00.0", 00:25:10.639 "trid": { 00:25:10.639 "trtype": "PCIe", 00:25:10.639 "traddr": "0000:5e:00.0" 00:25:10.639 }, 00:25:10.639 "ctrlr_data": { 00:25:10.639 "cntlid": 0, 00:25:10.639 "vendor_id": "0x8086", 
00:25:10.639 "model_number": "INTEL SSDPE2KX010T8", 00:25:10.639 "serial_number": "BTLJ807001JM1P0FGN", 00:25:10.639 "firmware_revision": "VDV10170", 00:25:10.639 "oacs": { 00:25:10.639 "security": 1, 00:25:10.639 "format": 1, 00:25:10.639 "firmware": 1, 00:25:10.639 "ns_manage": 1 00:25:10.639 }, 00:25:10.639 "multi_ctrlr": false, 00:25:10.639 "ana_reporting": false 00:25:10.639 }, 00:25:10.639 "vs": { 00:25:10.639 "nvme_version": "1.2" 00:25:10.639 }, 00:25:10.639 "ns_data": { 00:25:10.639 "id": 1, 00:25:10.639 "can_share": false 00:25:10.639 }, 00:25:10.639 "security": { 00:25:10.639 "opal": true 00:25:10.639 } 00:25:10.639 } 00:25:10.639 ], 00:25:10.639 "mp_policy": "active_passive" 00:25:10.639 } 00:25:10.639 } 00:25:10.639 ] 00:25:10.639 12:06:00 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:10.639 12:06:00 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:11.574 6d67c405-62fb-44a5-8b8d-4a66b064e9cc 00:25:11.574 12:06:01 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:11.833 1f80a534-fdc0-41a3-9c23-d304f2915479 00:25:11.833 12:06:01 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:11.833 12:06:01 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:25:11.834 12:06:01 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:11.834 12:06:01 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:11.834 12:06:01 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:11.834 12:06:01 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:11.834 12:06:01 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:11.834 
12:06:02 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:12.093 [ 00:25:12.093 { 00:25:12.093 "name": "1f80a534-fdc0-41a3-9c23-d304f2915479", 00:25:12.093 "aliases": [ 00:25:12.093 "lvs0/lv0" 00:25:12.093 ], 00:25:12.093 "product_name": "Logical Volume", 00:25:12.093 "block_size": 512, 00:25:12.093 "num_blocks": 204800, 00:25:12.093 "uuid": "1f80a534-fdc0-41a3-9c23-d304f2915479", 00:25:12.093 "assigned_rate_limits": { 00:25:12.093 "rw_ios_per_sec": 0, 00:25:12.093 "rw_mbytes_per_sec": 0, 00:25:12.093 "r_mbytes_per_sec": 0, 00:25:12.093 "w_mbytes_per_sec": 0 00:25:12.093 }, 00:25:12.093 "claimed": false, 00:25:12.093 "zoned": false, 00:25:12.093 "supported_io_types": { 00:25:12.093 "read": true, 00:25:12.093 "write": true, 00:25:12.093 "unmap": true, 00:25:12.093 "flush": false, 00:25:12.093 "reset": true, 00:25:12.093 "nvme_admin": false, 00:25:12.093 "nvme_io": false, 00:25:12.093 "nvme_io_md": false, 00:25:12.093 "write_zeroes": true, 00:25:12.093 "zcopy": false, 00:25:12.093 "get_zone_info": false, 00:25:12.093 "zone_management": false, 00:25:12.093 "zone_append": false, 00:25:12.093 "compare": false, 00:25:12.093 "compare_and_write": false, 00:25:12.093 "abort": false, 00:25:12.093 "seek_hole": true, 00:25:12.093 "seek_data": true, 00:25:12.093 "copy": false, 00:25:12.093 "nvme_iov_md": false 00:25:12.093 }, 00:25:12.093 "driver_specific": { 00:25:12.093 "lvol": { 00:25:12.093 "lvol_store_uuid": "6d67c405-62fb-44a5-8b8d-4a66b064e9cc", 00:25:12.093 "base_bdev": "Nvme0n1", 00:25:12.093 "thin_provision": true, 00:25:12.093 "num_allocated_clusters": 0, 00:25:12.093 "snapshot": false, 00:25:12.093 "clone": false, 00:25:12.093 "esnap_clone": false 00:25:12.093 } 00:25:12.093 } 00:25:12.093 } 00:25:12.093 ] 00:25:12.093 12:06:02 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:12.093 12:06:02 compress_isal -- compress/compress.sh@41 -- # '[' -z 
'' ']' 00:25:12.093 12:06:02 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:25:12.352 [2024-07-12 12:06:02.390961] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:12.352 COMP_lvs0/lv0 00:25:12.352 12:06:02 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:12.352 12:06:02 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:25:12.352 12:06:02 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:12.352 12:06:02 compress_isal -- common/autotest_common.sh@899 -- # local i 00:25:12.352 12:06:02 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:12.352 12:06:02 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:12.353 12:06:02 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:12.353 12:06:02 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:12.611 [ 00:25:12.611 { 00:25:12.611 "name": "COMP_lvs0/lv0", 00:25:12.611 "aliases": [ 00:25:12.611 "c46c117d-7ccd-5b22-92b5-094ac842e142" 00:25:12.611 ], 00:25:12.611 "product_name": "compress", 00:25:12.611 "block_size": 512, 00:25:12.611 "num_blocks": 200704, 00:25:12.611 "uuid": "c46c117d-7ccd-5b22-92b5-094ac842e142", 00:25:12.611 "assigned_rate_limits": { 00:25:12.611 "rw_ios_per_sec": 0, 00:25:12.611 "rw_mbytes_per_sec": 0, 00:25:12.611 "r_mbytes_per_sec": 0, 00:25:12.611 "w_mbytes_per_sec": 0 00:25:12.611 }, 00:25:12.611 "claimed": false, 00:25:12.611 "zoned": false, 00:25:12.611 "supported_io_types": { 00:25:12.611 "read": true, 00:25:12.611 "write": true, 00:25:12.611 "unmap": false, 00:25:12.611 "flush": false, 
00:25:12.611 "reset": false, 00:25:12.611 "nvme_admin": false, 00:25:12.611 "nvme_io": false, 00:25:12.611 "nvme_io_md": false, 00:25:12.611 "write_zeroes": true, 00:25:12.611 "zcopy": false, 00:25:12.611 "get_zone_info": false, 00:25:12.611 "zone_management": false, 00:25:12.611 "zone_append": false, 00:25:12.611 "compare": false, 00:25:12.611 "compare_and_write": false, 00:25:12.611 "abort": false, 00:25:12.611 "seek_hole": false, 00:25:12.611 "seek_data": false, 00:25:12.611 "copy": false, 00:25:12.611 "nvme_iov_md": false 00:25:12.611 }, 00:25:12.611 "driver_specific": { 00:25:12.611 "compress": { 00:25:12.611 "name": "COMP_lvs0/lv0", 00:25:12.611 "base_bdev_name": "1f80a534-fdc0-41a3-9c23-d304f2915479" 00:25:12.611 } 00:25:12.611 } 00:25:12.611 } 00:25:12.611 ] 00:25:12.611 12:06:02 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:25:12.611 12:06:02 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:25:12.611 I/O targets: 00:25:12.611 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:25:12.611 00:25:12.611 00:25:12.611 CUnit - A unit testing framework for C - Version 2.1-3 00:25:12.611 http://cunit.sourceforge.net/ 00:25:12.611 00:25:12.611 00:25:12.611 Suite: bdevio tests on: COMP_lvs0/lv0 00:25:12.611 Test: blockdev write read block ...passed 00:25:12.611 Test: blockdev write zeroes read block ...passed 00:25:12.611 Test: blockdev write zeroes read no split ...passed 00:25:12.869 Test: blockdev write zeroes read split ...passed 00:25:12.869 Test: blockdev write zeroes read split partial ...passed 00:25:12.869 Test: blockdev reset ...[2024-07-12 12:06:02.903194] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:25:12.869 passed 00:25:12.869 Test: blockdev write read 8 blocks ...passed 00:25:12.869 Test: blockdev write read size > 128k ...passed 00:25:12.869 Test: blockdev write read invalid size ...passed 00:25:12.869 
Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:12.870 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:12.870 Test: blockdev write read max offset ...passed 00:25:12.870 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:12.870 Test: blockdev writev readv 8 blocks ...passed 00:25:12.870 Test: blockdev writev readv 30 x 1block ...passed 00:25:12.870 Test: blockdev writev readv block ...passed 00:25:12.870 Test: blockdev writev readv size > 128k ...passed 00:25:12.870 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:12.870 Test: blockdev comparev and writev ...passed 00:25:12.870 Test: blockdev nvme passthru rw ...passed 00:25:12.870 Test: blockdev nvme passthru vendor specific ...passed 00:25:12.870 Test: blockdev nvme admin passthru ...passed 00:25:12.870 Test: blockdev copy ...passed 00:25:12.870 00:25:12.870 Run Summary: Type Total Ran Passed Failed Inactive 00:25:12.870 suites 1 1 n/a 0 0 00:25:12.870 tests 23 23 23 0 0 00:25:12.870 asserts 130 130 130 0 n/a 00:25:12.870 00:25:12.870 Elapsed time = 0.205 seconds 00:25:12.870 0 00:25:12.870 12:06:02 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:25:12.870 12:06:02 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:13.128 12:06:03 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:13.128 12:06:03 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:25:13.128 12:06:03 compress_isal -- compress/compress.sh@62 -- # killprocess 766741 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 766741 ']' 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@952 -- # kill -0 766741 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@953 -- 
# uname 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 766741 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 766741' 00:25:13.128 killing process with pid 766741 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@967 -- # kill 766741 00:25:13.128 12:06:03 compress_isal -- common/autotest_common.sh@972 -- # wait 766741 00:25:15.032 12:06:04 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:25:15.032 12:06:04 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:25:15.032 00:25:15.032 real 0m41.929s 00:25:15.032 user 1m34.522s 00:25:15.032 sys 0m2.783s 00:25:15.032 12:06:04 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:15.032 12:06:04 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:25:15.032 ************************************ 00:25:15.032 END TEST compress_isal 00:25:15.032 ************************************ 00:25:15.032 12:06:04 -- common/autotest_common.sh@1142 -- # return 0 00:25:15.032 12:06:04 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:25:15.032 12:06:04 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:25:15.032 12:06:04 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:25:15.032 12:06:04 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:25:15.032 12:06:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:15.032 12:06:04 -- common/autotest_common.sh@10 -- # set +x 00:25:15.032 ************************************ 00:25:15.032 START TEST blockdev_crypto_aesni 
00:25:15.032 ************************************ 00:25:15.032 12:06:04 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:25:15.032 * Looking for test storage... 00:25:15.032 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:25:15.032 12:06:04 
blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=768295 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 768295 00:25:15.032 12:06:04 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:25:15.032 12:06:04 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 768295 ']' 00:25:15.032 12:06:04 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:15.032 12:06:04 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:15.032 12:06:04 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:15.032 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:25:15.032 12:06:04 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:15.032 12:06:04 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:15.032 [2024-07-12 12:06:04.977457] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:25:15.032 [2024-07-12 12:06:04.977501] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid768295 ] 00:25:15.032 [2024-07-12 12:06:05.039358] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:15.032 [2024-07-12 12:06:05.112216] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:15.600 12:06:05 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:15.600 12:06:05 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:25:15.600 12:06:05 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:25:15.600 12:06:05 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:25:15.600 12:06:05 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:25:15.600 12:06:05 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:15.600 12:06:05 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:15.600 [2024-07-12 12:06:05.782248] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:15.600 [2024-07-12 12:06:05.790278] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:15.600 [2024-07-12 12:06:05.798294] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:15.858 [2024-07-12 12:06:05.862258] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto 
devices: 97 00:25:18.393 true 00:25:18.393 true 00:25:18.393 true 00:25:18.393 true 00:25:18.393 Malloc0 00:25:18.393 Malloc1 00:25:18.393 Malloc2 00:25:18.393 Malloc3 00:25:18.393 [2024-07-12 12:06:08.138664] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:18.393 crypto_ram 00:25:18.393 [2024-07-12 12:06:08.146685] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:18.393 crypto_ram2 00:25:18.393 [2024-07-12 12:06:08.154701] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:18.393 crypto_ram3 00:25:18.393 [2024-07-12 12:06:08.162722] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:18.393 crypto_ram4 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:18.393 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:18.393 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:25:18.393 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:18.393 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:18.393 12:06:08 blockdev_crypto_aesni 
-- common/autotest_common.sh@10 -- # set +x 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:18.393 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:18.393 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:25:18.393 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:25:18.393 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:18.393 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:25:18.393 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:25:18.394 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "51984aa2-9e44-56b9-ab73-51732c1c88cc"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "51984aa2-9e44-56b9-ab73-51732c1c88cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "c2cd82f8-c31d-5bcf-9b0f-213e6cc6d025"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c2cd82f8-c31d-5bcf-9b0f-213e6cc6d025",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b0a66265-8016-5276-8fee-ac9734b78c5c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 
8192,' ' "uuid": "b0a66265-8016-5276-8fee-ac9734b78c5c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "5e17f3a1-3362-5183-a2ba-d6c913e50854"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5e17f3a1-3362-5183-a2ba-d6c913e50854",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:18.394 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:25:18.394 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:25:18.394 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:25:18.394 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:25:18.394 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 768295 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 768295 ']' 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 768295 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 768295 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 768295' 00:25:18.394 killing process with pid 768295 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 768295 00:25:18.394 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 768295 00:25:18.653 12:06:08 blockdev_crypto_aesni -- 
bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:25:18.653 12:06:08 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:25:18.653 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:18.653 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:18.653 12:06:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:18.653 ************************************ 00:25:18.653 START TEST bdev_hello_world 00:25:18.653 ************************************ 00:25:18.653 12:06:08 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:25:18.653 [2024-07-12 12:06:08.881321] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:25:18.653 [2024-07-12 12:06:08.881352] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid769173 ] 00:25:18.912 [2024-07-12 12:06:08.938178] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:18.912 [2024-07-12 12:06:09.010620] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:18.912 [2024-07-12 12:06:09.031452] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:18.912 [2024-07-12 12:06:09.039476] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:18.912 [2024-07-12 12:06:09.047506] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:18.912 [2024-07-12 12:06:09.147675] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:21.446 [2024-07-12 12:06:11.297596] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:21.446 [2024-07-12 12:06:11.297647] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:21.446 [2024-07-12 12:06:11.297655] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:21.446 [2024-07-12 12:06:11.305615] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:21.446 [2024-07-12 12:06:11.305627] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:21.446 [2024-07-12 12:06:11.305632] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:21.446 [2024-07-12 12:06:11.313634] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_aesni_cbc_3" 00:25:21.446 [2024-07-12 12:06:11.313645] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:21.446 [2024-07-12 12:06:11.313649] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:21.446 [2024-07-12 12:06:11.321656] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:21.446 [2024-07-12 12:06:11.321667] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:21.446 [2024-07-12 12:06:11.321672] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:21.446 [2024-07-12 12:06:11.389095] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:25:21.446 [2024-07-12 12:06:11.389126] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:25:21.446 [2024-07-12 12:06:11.389137] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:25:21.446 [2024-07-12 12:06:11.390036] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:25:21.446 [2024-07-12 12:06:11.390087] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:25:21.446 [2024-07-12 12:06:11.390097] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:25:21.446 [2024-07-12 12:06:11.390125] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:25:21.446 00:25:21.446 [2024-07-12 12:06:11.390136] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:25:21.446 00:25:21.446 real 0m2.841s 00:25:21.446 user 0m2.564s 00:25:21.446 sys 0m0.238s 00:25:21.446 12:06:11 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:21.446 12:06:11 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:25:21.446 ************************************ 00:25:21.446 END TEST bdev_hello_world 00:25:21.446 ************************************ 00:25:21.705 12:06:11 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:25:21.705 12:06:11 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:25:21.705 12:06:11 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:25:21.705 12:06:11 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:21.705 12:06:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:21.705 ************************************ 00:25:21.705 START TEST bdev_bounds 00:25:21.705 ************************************ 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=769855 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 769855' 00:25:21.705 Process bdevio pid: 769855 00:25:21.705 12:06:11 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 769855 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 769855 ']' 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:21.705 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:21.705 12:06:11 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:21.706 12:06:11 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:21.706 [2024-07-12 12:06:11.797955] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:25:21.706 [2024-07-12 12:06:11.797991] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid769855 ] 00:25:21.706 [2024-07-12 12:06:11.861497] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:21.706 [2024-07-12 12:06:11.940769] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:25:21.706 [2024-07-12 12:06:11.940864] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:21.706 [2024-07-12 12:06:11.940864] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:25:21.964 [2024-07-12 12:06:11.961756] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:21.964 [2024-07-12 12:06:11.969785] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:21.964 [2024-07-12 12:06:11.977805] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:21.964 [2024-07-12 12:06:12.073566] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:24.497 [2024-07-12 12:06:14.219086] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:24.497 [2024-07-12 12:06:14.219143] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:24.497 [2024-07-12 12:06:14.219152] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:24.497 [2024-07-12 12:06:14.227106] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:24.497 [2024-07-12 12:06:14.227119] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:24.497 [2024-07-12 12:06:14.227125] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:24.497 [2024-07-12 12:06:14.235128] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:24.497 [2024-07-12 12:06:14.235138] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:24.497 [2024-07-12 12:06:14.235143] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:24.497 [2024-07-12 12:06:14.243150] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:24.497 [2024-07-12 12:06:14.243159] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:24.497 [2024-07-12 12:06:14.243165] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:24.497 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:24.497 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:25:24.497 12:06:14 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:25:24.497 I/O targets: 00:25:24.497 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:25:24.497 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:25:24.497 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:25:24.497 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:25:24.497 00:25:24.497 00:25:24.497 CUnit - A unit testing framework for C - Version 2.1-3 00:25:24.497 http://cunit.sourceforge.net/ 00:25:24.497 00:25:24.497 00:25:24.497 Suite: bdevio tests on: crypto_ram4 00:25:24.497 Test: blockdev write read block ...passed 00:25:24.497 Test: blockdev write zeroes read block ...passed 00:25:24.497 Test: blockdev write zeroes read no split ...passed 00:25:24.497 Test: blockdev write zeroes read split ...passed 
00:25:24.497 Test: blockdev write zeroes read split partial ...passed 00:25:24.497 Test: blockdev reset ...passed 00:25:24.497 Test: blockdev write read 8 blocks ...passed 00:25:24.497 Test: blockdev write read size > 128k ...passed 00:25:24.497 Test: blockdev write read invalid size ...passed 00:25:24.497 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:24.497 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:24.497 Test: blockdev write read max offset ...passed 00:25:24.497 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:24.497 Test: blockdev writev readv 8 blocks ...passed 00:25:24.497 Test: blockdev writev readv 30 x 1block ...passed 00:25:24.497 Test: blockdev writev readv block ...passed 00:25:24.497 Test: blockdev writev readv size > 128k ...passed 00:25:24.497 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:24.497 Test: blockdev comparev and writev ...passed 00:25:24.497 Test: blockdev nvme passthru rw ...passed 00:25:24.497 Test: blockdev nvme passthru vendor specific ...passed 00:25:24.497 Test: blockdev nvme admin passthru ...passed 00:25:24.497 Test: blockdev copy ...passed 00:25:24.497 Suite: bdevio tests on: crypto_ram3 00:25:24.497 Test: blockdev write read block ...passed 00:25:24.497 Test: blockdev write zeroes read block ...passed 00:25:24.497 Test: blockdev write zeroes read no split ...passed 00:25:24.497 Test: blockdev write zeroes read split ...passed 00:25:24.497 Test: blockdev write zeroes read split partial ...passed 00:25:24.497 Test: blockdev reset ...passed 00:25:24.497 Test: blockdev write read 8 blocks ...passed 00:25:24.497 Test: blockdev write read size > 128k ...passed 00:25:24.497 Test: blockdev write read invalid size ...passed 00:25:24.498 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:24.498 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:24.498 Test: blockdev 
write read max offset ...passed 00:25:24.498 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:24.498 Test: blockdev writev readv 8 blocks ...passed 00:25:24.498 Test: blockdev writev readv 30 x 1block ...passed 00:25:24.498 Test: blockdev writev readv block ...passed 00:25:24.498 Test: blockdev writev readv size > 128k ...passed 00:25:24.498 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:24.498 Test: blockdev comparev and writev ...passed 00:25:24.498 Test: blockdev nvme passthru rw ...passed 00:25:24.498 Test: blockdev nvme passthru vendor specific ...passed 00:25:24.498 Test: blockdev nvme admin passthru ...passed 00:25:24.498 Test: blockdev copy ...passed 00:25:24.498 Suite: bdevio tests on: crypto_ram2 00:25:24.498 Test: blockdev write read block ...passed 00:25:24.498 Test: blockdev write zeroes read block ...passed 00:25:24.498 Test: blockdev write zeroes read no split ...passed 00:25:24.498 Test: blockdev write zeroes read split ...passed 00:25:24.498 Test: blockdev write zeroes read split partial ...passed 00:25:24.498 Test: blockdev reset ...passed 00:25:24.498 Test: blockdev write read 8 blocks ...passed 00:25:24.498 Test: blockdev write read size > 128k ...passed 00:25:24.498 Test: blockdev write read invalid size ...passed 00:25:24.498 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:24.498 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:24.498 Test: blockdev write read max offset ...passed 00:25:24.498 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:24.498 Test: blockdev writev readv 8 blocks ...passed 00:25:24.498 Test: blockdev writev readv 30 x 1block ...passed 00:25:24.498 Test: blockdev writev readv block ...passed 00:25:24.498 Test: blockdev writev readv size > 128k ...passed 00:25:24.498 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:24.498 Test: blockdev comparev and writev ...passed 
00:25:24.498 Test: blockdev nvme passthru rw ...passed 00:25:24.498 Test: blockdev nvme passthru vendor specific ...passed 00:25:24.498 Test: blockdev nvme admin passthru ...passed 00:25:24.498 Test: blockdev copy ...passed 00:25:24.498 Suite: bdevio tests on: crypto_ram 00:25:24.498 Test: blockdev write read block ...passed 00:25:24.498 Test: blockdev write zeroes read block ...passed 00:25:24.498 Test: blockdev write zeroes read no split ...passed 00:25:24.498 Test: blockdev write zeroes read split ...passed 00:25:24.498 Test: blockdev write zeroes read split partial ...passed 00:25:24.498 Test: blockdev reset ...passed 00:25:24.498 Test: blockdev write read 8 blocks ...passed 00:25:24.498 Test: blockdev write read size > 128k ...passed 00:25:24.498 Test: blockdev write read invalid size ...passed 00:25:24.498 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:24.498 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:24.498 Test: blockdev write read max offset ...passed 00:25:24.498 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:24.498 Test: blockdev writev readv 8 blocks ...passed 00:25:24.498 Test: blockdev writev readv 30 x 1block ...passed 00:25:24.498 Test: blockdev writev readv block ...passed 00:25:24.498 Test: blockdev writev readv size > 128k ...passed 00:25:24.498 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:24.498 Test: blockdev comparev and writev ...passed 00:25:24.498 Test: blockdev nvme passthru rw ...passed 00:25:24.498 Test: blockdev nvme passthru vendor specific ...passed 00:25:24.498 Test: blockdev nvme admin passthru ...passed 00:25:24.498 Test: blockdev copy ...passed 00:25:24.498 00:25:24.498 Run Summary: Type Total Ran Passed Failed Inactive 00:25:24.498 suites 4 4 n/a 0 0 00:25:24.498 tests 92 92 92 0 0 00:25:24.498 asserts 520 520 520 0 n/a 00:25:24.498 00:25:24.498 Elapsed time = 0.523 seconds 00:25:24.498 0 00:25:24.498 12:06:14 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 769855 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 769855 ']' 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 769855 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 769855 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 769855' 00:25:24.498 killing process with pid 769855 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 769855 00:25:24.498 12:06:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 769855 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:25:25.067 00:25:25.067 real 0m3.297s 00:25:25.067 user 0m9.326s 00:25:25.067 sys 0m0.386s 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:25.067 ************************************ 00:25:25.067 END TEST bdev_bounds 00:25:25.067 ************************************ 00:25:25.067 12:06:15 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:25:25.067 12:06:15 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test 
bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:25:25.067 12:06:15 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:25.067 12:06:15 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:25.067 12:06:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:25.067 ************************************ 00:25:25.067 START TEST bdev_nbd 00:25:25.067 ************************************ 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' 
'/dev/nbd9') 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=770341 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 770341 /var/tmp/spdk-nbd.sock 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 770341 ']' 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:25:25.067 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:25.067 12:06:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:25:25.067 [2024-07-12 12:06:15.160473] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:25:25.067 [2024-07-12 12:06:15.160509] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:25.067 [2024-07-12 12:06:15.225270] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:25.067 [2024-07-12 12:06:15.302016] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:25.400 [2024-07-12 12:06:15.322880] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:25.400 [2024-07-12 12:06:15.330898] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:25.400 [2024-07-12 12:06:15.338918] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:25.400 [2024-07-12 12:06:15.436385] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:27.943 [2024-07-12 12:06:17.583823] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:27.943 [2024-07-12 12:06:17.583867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:27.943 [2024-07-12 12:06:17.583875] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:27.943 [2024-07-12 12:06:17.591843] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:27.943 [2024-07-12 12:06:17.591854] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc1 00:25:27.943 [2024-07-12 12:06:17.591859] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:27.943 [2024-07-12 12:06:17.599863] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:27.943 [2024-07-12 12:06:17.599872] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:27.943 [2024-07-12 12:06:17.599878] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:27.943 [2024-07-12 12:06:17.607881] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:27.943 [2024-07-12 12:06:17.607890] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:27.943 [2024-07-12 12:06:17.607895] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:27.943 1+0 records in 00:25:27.943 1+0 records out 00:25:27.943 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000152294 s, 26.9 MB/s 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:27.943 12:06:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:25:27.943 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:25:27.943 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:25:27.943 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:25:27.943 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:27.943 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:27.943 12:06:18 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:27.943 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:27.943 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:27.943 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:27.944 1+0 records in 00:25:27.944 1+0 records out 00:25:27.944 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229084 s, 17.9 MB/s 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:27.944 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:25:28.202 12:06:18 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:28.202 1+0 records in 00:25:28.202 1+0 records out 00:25:28.202 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261412 s, 15.7 MB/s 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 
'!=' 0 ']' 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:28.202 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:28.462 1+0 records in 00:25:28.462 1+0 records out 00:25:28.462 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000201802 s, 20.3 MB/s 00:25:28.462 
12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:25:28.462 { 00:25:28.462 "nbd_device": "/dev/nbd0", 00:25:28.462 "bdev_name": "crypto_ram" 00:25:28.462 }, 00:25:28.462 { 00:25:28.462 "nbd_device": "/dev/nbd1", 00:25:28.462 "bdev_name": "crypto_ram2" 00:25:28.462 }, 00:25:28.462 { 00:25:28.462 "nbd_device": "/dev/nbd2", 00:25:28.462 "bdev_name": "crypto_ram3" 00:25:28.462 }, 00:25:28.462 { 00:25:28.462 "nbd_device": "/dev/nbd3", 00:25:28.462 "bdev_name": "crypto_ram4" 00:25:28.462 } 00:25:28.462 ]' 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:25:28.462 { 00:25:28.462 "nbd_device": "/dev/nbd0", 00:25:28.462 "bdev_name": "crypto_ram" 00:25:28.462 }, 00:25:28.462 { 00:25:28.462 "nbd_device": "/dev/nbd1", 00:25:28.462 
"bdev_name": "crypto_ram2" 00:25:28.462 }, 00:25:28.462 { 00:25:28.462 "nbd_device": "/dev/nbd2", 00:25:28.462 "bdev_name": "crypto_ram3" 00:25:28.462 }, 00:25:28.462 { 00:25:28.462 "nbd_device": "/dev/nbd3", 00:25:28.462 "bdev_name": "crypto_ram4" 00:25:28.462 } 00:25:28.462 ]' 00:25:28.462 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@41 -- # break 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:28.721 12:06:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:28.980 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:29.239 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:25:29.498 12:06:19 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:29.498 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:25:29.756 /dev/nbd0 00:25:29.756 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:29.756 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:29.757 12:06:19 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:29.757 1+0 records in 00:25:29.757 1+0 records out 00:25:29.757 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212099 s, 19.3 MB/s 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:29.757 12:06:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:25:30.015 /dev/nbd1 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 
00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:30.015 1+0 records in 00:25:30.015 1+0 records out 00:25:30.015 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000176546 s, 23.2 MB/s 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:25:30.015 /dev/nbd10 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:30.015 1+0 records in 00:25:30.015 1+0 records out 00:25:30.015 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000177108 s, 23.1 MB/s 00:25:30.015 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:30.274 12:06:20 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:25:30.274 /dev/nbd11 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:30.274 1+0 records in 00:25:30.274 1+0 records out 00:25:30.274 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233495 s, 17.5 
MB/s 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:30.274 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:30.533 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:25:30.534 { 00:25:30.534 "nbd_device": "/dev/nbd0", 00:25:30.534 "bdev_name": "crypto_ram" 00:25:30.534 }, 00:25:30.534 { 00:25:30.534 "nbd_device": "/dev/nbd1", 00:25:30.534 "bdev_name": "crypto_ram2" 00:25:30.534 }, 00:25:30.534 { 00:25:30.534 "nbd_device": "/dev/nbd10", 00:25:30.534 "bdev_name": "crypto_ram3" 00:25:30.534 }, 00:25:30.534 { 00:25:30.534 "nbd_device": "/dev/nbd11", 00:25:30.534 "bdev_name": "crypto_ram4" 00:25:30.534 } 00:25:30.534 ]' 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:25:30.534 { 00:25:30.534 "nbd_device": "/dev/nbd0", 00:25:30.534 "bdev_name": 
"crypto_ram" 00:25:30.534 }, 00:25:30.534 { 00:25:30.534 "nbd_device": "/dev/nbd1", 00:25:30.534 "bdev_name": "crypto_ram2" 00:25:30.534 }, 00:25:30.534 { 00:25:30.534 "nbd_device": "/dev/nbd10", 00:25:30.534 "bdev_name": "crypto_ram3" 00:25:30.534 }, 00:25:30.534 { 00:25:30.534 "nbd_device": "/dev/nbd11", 00:25:30.534 "bdev_name": "crypto_ram4" 00:25:30.534 } 00:25:30.534 ]' 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:25:30.534 /dev/nbd1 00:25:30.534 /dev/nbd10 00:25:30.534 /dev/nbd11' 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:25:30.534 /dev/nbd1 00:25:30.534 /dev/nbd10 00:25:30.534 /dev/nbd11' 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:30.534 
12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:25:30.534 256+0 records in 00:25:30.534 256+0 records out 00:25:30.534 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104348 s, 100 MB/s 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:25:30.534 256+0 records in 00:25:30.534 256+0 records out 00:25:30.534 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0264184 s, 39.7 MB/s 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:25:30.534 256+0 records in 00:25:30.534 256+0 records out 00:25:30.534 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0304207 s, 34.5 MB/s 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:30.534 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:25:30.793 256+0 records in 00:25:30.793 256+0 records out 00:25:30.793 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0270366 s, 38.8 MB/s 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:25:30.793 256+0 records in 00:25:30.793 256+0 records out 00:25:30.793 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0251141 s, 41.8 MB/s 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:25:30.793 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 
00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:30.794 12:06:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:31.053 
12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:31.053 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:25:31.334 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:25:31.334 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:25:31.334 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:25:31.334 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:31.334 
12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:31.334 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:25:31.334 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:31.334 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:31.334 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:31.334 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:31.593 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 
-- # nbd_disks_json='[]' 00:25:31.594 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:25:31.594 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:25:31.853 12:06:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:25:31.853 malloc_lvol_verify 00:25:31.853 12:06:22 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:25:32.111 2e94283d-a8d8-4114-84fd-5608d912db6b 00:25:32.111 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:25:32.111 1a4e9423-1344-4bb1-a15f-749eb5dbc057 00:25:32.112 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:25:32.371 /dev/nbd0 00:25:32.371 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:25:32.371 mke2fs 1.46.5 (30-Dec-2021) 00:25:32.371 Discarding device blocks: 0/4096 done 00:25:32.371 Creating filesystem with 4096 1k blocks and 1024 inodes 00:25:32.371 00:25:32.371 Allocating group tables: 0/1 done 00:25:32.371 Writing inode tables: 0/1 done 00:25:32.371 Creating journal (1024 blocks): done 00:25:32.371 Writing superblocks and filesystem accounting information: 0/1 done 00:25:32.371 00:25:32.371 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:25:32.371 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:25:32.371 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:32.371 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:32.371 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:32.371 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:32.371 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 
00:25:32.371 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 770341 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 770341 ']' 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 770341 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 770341 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:32.630 12:06:22 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 770341' 00:25:32.630 killing process with pid 770341 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 770341 00:25:32.630 12:06:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 770341 00:25:32.889 12:06:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:25:32.889 00:25:32.889 real 0m7.966s 00:25:32.889 user 0m10.639s 00:25:32.889 sys 0m2.433s 00:25:32.889 12:06:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:32.889 12:06:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:25:32.889 ************************************ 00:25:32.889 END TEST bdev_nbd 00:25:32.889 ************************************ 00:25:32.889 12:06:23 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:25:32.889 12:06:23 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:25:32.889 12:06:23 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:25:32.889 12:06:23 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:25:32.889 12:06:23 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:25:32.889 12:06:23 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:25:32.889 12:06:23 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:32.889 12:06:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:33.148 ************************************ 00:25:33.148 START TEST bdev_fio 00:25:33.148 ************************************ 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1123 -- # fio_test_suite '' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:25:33.148 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:25:33.148 12:06:23 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:33.148 ************************************ 00:25:33.148 START TEST bdev_fio_rw_verify 00:25:33.148 ************************************ 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:33.148 12:06:23 
blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:25:33.148 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # asan_lib= 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:25:33.149 12:06:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:33.408 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:33.408 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:33.408 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:33.408 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:33.408 fio-3.35 00:25:33.408 Starting 4 threads 00:25:48.293 00:25:48.293 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=772472: Fri Jul 12 12:06:36 2024 00:25:48.293 read: IOPS=26.8k, BW=105MiB/s (110MB/s)(1045MiB/10001msec) 00:25:48.293 slat (usec): min=11, max=912, avg=51.48, stdev=25.48 00:25:48.293 clat (usec): min=9, max=1969, avg=275.70, stdev=159.07 00:25:48.293 lat (usec): min=36, max=2150, avg=327.17, stdev=171.60 00:25:48.293 clat percentiles (usec): 00:25:48.293 | 50.000th=[ 243], 99.000th=[ 709], 99.900th=[ 865], 99.990th=[ 1156], 00:25:48.293 | 99.999th=[ 1729] 00:25:48.293 write: IOPS=29.5k, BW=115MiB/s (121MB/s)(1120MiB/9732msec); 0 zone resets 00:25:48.293 slat (usec): min=16, max=422, avg=59.18, stdev=23.63 00:25:48.293 clat (usec): min=24, max=2699, avg=322.74, stdev=181.27 00:25:48.293 lat (usec): min=48, max=2752, avg=381.92, stdev=191.24 00:25:48.293 clat percentiles (usec): 00:25:48.293 | 50.000th=[ 293], 99.000th=[ 865], 99.900th=[ 1012], 99.990th=[ 1221], 00:25:48.293 | 99.999th=[ 2180] 00:25:48.293 bw ( KiB/s): min=93840, max=172181, per=97.20%, avg=114558.16, stdev=5672.00, samples=76 00:25:48.293 iops : min=23460, max=43045, avg=28639.53, stdev=1417.99, samples=76 00:25:48.293 lat (usec) : 10=0.01%, 20=0.01%, 50=1.33%, 100=8.05%, 250=36.50% 00:25:48.293 lat (usec) : 500=40.99%, 750=11.40%, 1000=1.65% 00:25:48.293 lat (msec) : 2=0.07%, 4=0.01% 00:25:48.293 cpu : usr=99.69%, sys=0.00%, ctx=49, majf=0, minf=251 00:25:48.293 IO depths : 1=9.9%, 2=25.7%, 4=51.4%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:25:48.293 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:48.293 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:48.293 issued rwts: total=267549,286746,0,0 short=0,0,0,0 dropped=0,0,0,0 00:25:48.293 
latency : target=0, window=0, percentile=100.00%, depth=8 00:25:48.293 00:25:48.293 Run status group 0 (all jobs): 00:25:48.293 READ: bw=105MiB/s (110MB/s), 105MiB/s-105MiB/s (110MB/s-110MB/s), io=1045MiB (1096MB), run=10001-10001msec 00:25:48.293 WRITE: bw=115MiB/s (121MB/s), 115MiB/s-115MiB/s (121MB/s-121MB/s), io=1120MiB (1175MB), run=9732-9732msec 00:25:48.293 00:25:48.293 real 0m13.223s 00:25:48.293 user 0m48.557s 00:25:48.293 sys 0m0.349s 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:25:48.293 ************************************ 00:25:48.293 END TEST bdev_fio_rw_verify 00:25:48.293 ************************************ 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:25:48.293 
12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:25:48.293 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "51984aa2-9e44-56b9-ab73-51732c1c88cc"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "51984aa2-9e44-56b9-ab73-51732c1c88cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "c2cd82f8-c31d-5bcf-9b0f-213e6cc6d025"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c2cd82f8-c31d-5bcf-9b0f-213e6cc6d025",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b0a66265-8016-5276-8fee-ac9734b78c5c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b0a66265-8016-5276-8fee-ac9734b78c5c",' ' "assigned_rate_limits": 
{' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "5e17f3a1-3362-5183-a2ba-d6c913e50854"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5e17f3a1-3362-5183-a2ba-d6c913e50854",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:25:48.294 crypto_ram2 00:25:48.294 crypto_ram3 00:25:48.294 crypto_ram4 ]] 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "51984aa2-9e44-56b9-ab73-51732c1c88cc"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "51984aa2-9e44-56b9-ab73-51732c1c88cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram2",' ' "aliases": [' ' "c2cd82f8-c31d-5bcf-9b0f-213e6cc6d025"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c2cd82f8-c31d-5bcf-9b0f-213e6cc6d025",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b0a66265-8016-5276-8fee-ac9734b78c5c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b0a66265-8016-5276-8fee-ac9734b78c5c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "5e17f3a1-3362-5183-a2ba-d6c913e50854"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5e17f3a1-3362-5183-a2ba-d6c913e50854",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 
00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:25:48.294 12:06:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio 
-- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:48.295 ************************************ 00:25:48.295 START TEST bdev_fio_trim 00:25:48.295 ************************************ 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1343 -- # local asan_lib= 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:25:48.295 12:06:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 
--bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:48.295 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:48.295 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:48.295 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:48.295 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:48.295 fio-3.35 00:25:48.295 Starting 4 threads 00:26:00.505 00:26:00.505 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=774734: Fri Jul 12 12:06:49 2024 00:26:00.505 write: IOPS=43.3k, BW=169MiB/s (177MB/s)(1691MiB/10001msec); 0 zone resets 00:26:00.505 slat (usec): min=11, max=1269, avg=50.83, stdev=20.85 00:26:00.505 clat (usec): min=21, max=2134, avg=231.97, stdev=116.21 00:26:00.505 lat (usec): min=50, max=2360, avg=282.80, stdev=124.86 00:26:00.505 clat percentiles (usec): 00:26:00.505 | 50.000th=[ 212], 99.000th=[ 562], 99.900th=[ 652], 99.990th=[ 906], 00:26:00.505 | 99.999th=[ 1582] 00:26:00.505 bw ( KiB/s): min=165088, max=261176, per=100.00%, avg=173490.11, stdev=6060.12, samples=76 00:26:00.505 iops : min=41272, max=65294, avg=43372.53, stdev=1515.03, samples=76 00:26:00.505 trim: IOPS=43.3k, BW=169MiB/s (177MB/s)(1691MiB/10001msec); 0 zone resets 00:26:00.505 slat (usec): min=4, max=251, avg=16.55, stdev= 6.81 00:26:00.505 clat (usec): min=41, max=1562, avg=216.86, stdev=87.40 00:26:00.505 lat (usec): min=46, max=1575, avg=233.41, stdev=89.30 00:26:00.505 clat percentiles (usec): 00:26:00.505 | 50.000th=[ 208], 99.000th=[ 420], 
99.900th=[ 469], 99.990th=[ 685], 00:26:00.505 | 99.999th=[ 1237] 00:26:00.505 bw ( KiB/s): min=165080, max=261208, per=100.00%, avg=173491.79, stdev=6060.76, samples=76 00:26:00.505 iops : min=41270, max=65302, avg=43372.95, stdev=1515.19, samples=76 00:26:00.505 lat (usec) : 50=0.44%, 100=8.35%, 250=55.09%, 500=34.43%, 750=1.67% 00:26:00.505 lat (usec) : 1000=0.01% 00:26:00.505 lat (msec) : 2=0.01%, 4=0.01% 00:26:00.505 cpu : usr=99.68%, sys=0.00%, ctx=52, majf=0, minf=85 00:26:00.505 IO depths : 1=6.9%, 2=26.6%, 4=53.2%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:00.505 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:00.505 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:00.505 issued rwts: total=0,432965,432965,0 short=0,0,0,0 dropped=0,0,0,0 00:26:00.505 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:00.505 00:26:00.505 Run status group 0 (all jobs): 00:26:00.505 WRITE: bw=169MiB/s (177MB/s), 169MiB/s-169MiB/s (177MB/s-177MB/s), io=1691MiB (1773MB), run=10001-10001msec 00:26:00.505 TRIM: bw=169MiB/s (177MB/s), 169MiB/s-169MiB/s (177MB/s-177MB/s), io=1691MiB (1773MB), run=10001-10001msec 00:26:00.505 00:26:00.505 real 0m13.232s 00:26:00.505 user 0m48.667s 00:26:00.505 sys 0m0.384s 00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:26:00.505 ************************************ 00:26:00.505 END TEST bdev_fio_trim 00:26:00.505 ************************************ 00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:26:00.505 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:26:00.505 00:26:00.505 real 0m26.720s 00:26:00.505 user 1m37.380s 00:26:00.505 sys 0m0.860s 00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:00.505 ************************************ 00:26:00.505 END TEST bdev_fio 00:26:00.505 ************************************ 00:26:00.505 12:06:49 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:26:00.505 12:06:49 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:00.505 12:06:49 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:00.505 12:06:49 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:26:00.505 12:06:49 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:00.505 12:06:49 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:00.505 ************************************ 00:26:00.505 START TEST bdev_verify 00:26:00.505 ************************************ 00:26:00.505 12:06:49 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:00.505 [2024-07-12 12:06:49.982162] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:26:00.505 [2024-07-12 12:06:49.982203] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid776473 ] 00:26:00.506 [2024-07-12 12:06:50.051017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:00.506 [2024-07-12 12:06:50.127693] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:00.506 [2024-07-12 12:06:50.127695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:00.506 [2024-07-12 12:06:50.148634] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:00.506 [2024-07-12 12:06:50.156656] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:00.506 [2024-07-12 12:06:50.164678] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:00.506 [2024-07-12 12:06:50.263281] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:02.409 [2024-07-12 12:06:52.413931] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:02.410 [2024-07-12 12:06:52.413987] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:02.410 [2024-07-12 12:06:52.413995] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:02.410 [2024-07-12 12:06:52.421941] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:02.410 [2024-07-12 12:06:52.421954] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:02.410 [2024-07-12 12:06:52.421960] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:02.410 [2024-07-12 
12:06:52.429961] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:02.410 [2024-07-12 12:06:52.429971] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:02.410 [2024-07-12 12:06:52.429976] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:02.410 [2024-07-12 12:06:52.437983] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:02.410 [2024-07-12 12:06:52.437992] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:02.410 [2024-07-12 12:06:52.437997] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:02.410 Running I/O for 5 seconds... 00:26:07.679 00:26:07.679 Latency(us) 00:26:07.679 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:07.679 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:07.679 Verification LBA range: start 0x0 length 0x1000 00:26:07.679 crypto_ram : 5.05 734.77 2.87 0.00 0.00 173894.00 3089.55 119337.94 00:26:07.679 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:07.679 Verification LBA range: start 0x1000 length 0x1000 00:26:07.679 crypto_ram : 5.05 735.23 2.87 0.00 0.00 173790.31 5367.71 118838.61 00:26:07.679 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:07.679 Verification LBA range: start 0x0 length 0x1000 00:26:07.679 crypto_ram2 : 5.05 734.67 2.87 0.00 0.00 173451.80 3136.37 106355.57 00:26:07.679 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:07.679 Verification LBA range: start 0x1000 length 0x1000 00:26:07.679 crypto_ram2 : 5.05 735.14 2.87 0.00 0.00 173335.60 5898.24 105856.24 00:26:07.679 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:07.679 Verification 
LBA range: start 0x0 length 0x1000 00:26:07.679 crypto_ram3 : 5.04 5759.54 22.50 0.00 0.00 22054.54 3229.99 17351.44 00:26:07.679 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:07.679 Verification LBA range: start 0x1000 length 0x1000 00:26:07.679 crypto_ram3 : 5.04 5786.20 22.60 0.00 0.00 21953.78 3042.74 17351.44 00:26:07.679 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:07.679 Verification LBA range: start 0x0 length 0x1000 00:26:07.679 crypto_ram4 : 5.05 5758.10 22.49 0.00 0.00 22014.02 3557.67 16477.62 00:26:07.679 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:07.679 Verification LBA range: start 0x1000 length 0x1000 00:26:07.679 crypto_ram4 : 5.04 5785.38 22.60 0.00 0.00 21913.50 3183.18 16103.13 00:26:07.679 =================================================================================================================== 00:26:07.679 Total : 26029.01 101.68 0.00 0.00 39127.66 3042.74 119337.94 00:26:07.679 00:26:07.679 real 0m7.963s 00:26:07.679 user 0m15.306s 00:26:07.679 sys 0m0.262s 00:26:07.679 12:06:57 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:07.679 12:06:57 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:26:07.679 ************************************ 00:26:07.679 END TEST bdev_verify 00:26:07.679 ************************************ 00:26:07.938 12:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:26:07.938 12:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:07.938 12:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:26:07.938 12:06:57 blockdev_crypto_aesni -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:26:07.938 12:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:07.938 ************************************ 00:26:07.938 START TEST bdev_verify_big_io 00:26:07.938 ************************************ 00:26:07.938 12:06:57 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:07.938 [2024-07-12 12:06:58.014066] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:26:07.938 [2024-07-12 12:06:58.014102] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid777803 ] 00:26:07.938 [2024-07-12 12:06:58.078470] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:07.938 [2024-07-12 12:06:58.151354] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:07.938 [2024-07-12 12:06:58.151356] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:07.938 [2024-07-12 12:06:58.172296] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:07.938 [2024-07-12 12:06:58.180317] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:08.197 [2024-07-12 12:06:58.188336] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:08.197 [2024-07-12 12:06:58.288529] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:10.728 [2024-07-12 12:07:00.436119] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 
00:26:10.728 [2024-07-12 12:07:00.436183] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:10.728 [2024-07-12 12:07:00.436191] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:10.728 [2024-07-12 12:07:00.444135] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:10.728 [2024-07-12 12:07:00.444149] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:10.728 [2024-07-12 12:07:00.444155] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:10.728 [2024-07-12 12:07:00.452156] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:10.728 [2024-07-12 12:07:00.452167] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:10.728 [2024-07-12 12:07:00.452172] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:10.728 [2024-07-12 12:07:00.460178] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:10.729 [2024-07-12 12:07:00.460189] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:10.729 [2024-07-12 12:07:00.460194] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:10.729 Running I/O for 5 seconds... 
00:26:16.042 00:26:16.042 Latency(us) 00:26:16.042 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:16.042 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:16.042 Verification LBA range: start 0x0 length 0x100 00:26:16.042 crypto_ram : 5.64 67.14 4.20 0.00 0.00 1860432.90 36200.84 1637775.85 00:26:16.042 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:16.042 Verification LBA range: start 0x100 length 0x100 00:26:16.042 crypto_ram : 5.56 68.30 4.27 0.00 0.00 1835917.62 47685.24 1645765.00 00:26:16.042 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:16.042 Verification LBA range: start 0x0 length 0x100 00:26:16.042 crypto_ram2 : 5.65 67.31 4.21 0.00 0.00 1798505.27 36450.50 1637775.85 00:26:16.042 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:16.042 Verification LBA range: start 0x100 length 0x100 00:26:16.042 crypto_ram2 : 5.56 68.29 4.27 0.00 0.00 1791061.43 47435.58 1645765.00 00:26:16.042 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:16.042 Verification LBA range: start 0x0 length 0x100 00:26:16.042 crypto_ram3 : 5.45 452.39 28.27 0.00 0.00 257590.77 21096.35 341536.18 00:26:16.042 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:16.042 Verification LBA range: start 0x100 length 0x100 00:26:16.042 crypto_ram3 : 5.38 454.94 28.43 0.00 0.00 260837.13 20222.54 347528.05 00:26:16.042 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:16.042 Verification LBA range: start 0x0 length 0x100 00:26:16.042 crypto_ram4 : 5.57 477.41 29.84 0.00 0.00 240021.45 9549.53 339538.90 00:26:16.042 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:16.042 Verification LBA range: start 0x100 length 0x100 00:26:16.042 crypto_ram4 : 5.44 471.69 29.48 0.00 0.00 246668.95 
10236.10 347528.05 00:26:16.042 =================================================================================================================== 00:26:16.042 Total : 2127.47 132.97 0.00 0.00 455574.71 9549.53 1645765.00 00:26:16.301 00:26:16.301 real 0m8.564s 00:26:16.301 user 0m16.497s 00:26:16.301 sys 0m0.275s 00:26:16.301 12:07:06 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:16.301 12:07:06 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:26:16.301 ************************************ 00:26:16.301 END TEST bdev_verify_big_io 00:26:16.301 ************************************ 00:26:16.561 12:07:06 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:26:16.561 12:07:06 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:16.561 12:07:06 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:26:16.561 12:07:06 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:16.561 12:07:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:16.561 ************************************ 00:26:16.561 START TEST bdev_write_zeroes 00:26:16.561 ************************************ 00:26:16.561 12:07:06 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:16.561 [2024-07-12 12:07:06.630497] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:26:16.561 [2024-07-12 12:07:06.630537] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid779215 ] 00:26:16.561 [2024-07-12 12:07:06.692858] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:16.561 [2024-07-12 12:07:06.763568] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:16.561 [2024-07-12 12:07:06.784484] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:16.561 [2024-07-12 12:07:06.792509] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:16.561 [2024-07-12 12:07:06.800530] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:16.820 [2024-07-12 12:07:06.896532] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:19.355 [2024-07-12 12:07:09.047448] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:19.355 [2024-07-12 12:07:09.047525] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:19.355 [2024-07-12 12:07:09.047535] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:19.355 [2024-07-12 12:07:09.055470] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:19.355 [2024-07-12 12:07:09.055482] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:19.355 [2024-07-12 12:07:09.055488] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:19.355 [2024-07-12 12:07:09.063487] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 
00:26:19.355 [2024-07-12 12:07:09.063499] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:19.355 [2024-07-12 12:07:09.063503] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:19.355 [2024-07-12 12:07:09.071512] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:19.355 [2024-07-12 12:07:09.071527] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:19.355 [2024-07-12 12:07:09.071532] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:19.355 Running I/O for 1 seconds... 00:26:20.292 00:26:20.292 Latency(us) 00:26:20.292 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:20.292 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:20.292 crypto_ram : 1.02 3046.78 11.90 0.00 0.00 41797.13 3464.05 49682.53 00:26:20.292 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:20.292 crypto_ram2 : 1.02 3060.22 11.95 0.00 0.00 41494.21 3448.44 46936.26 00:26:20.292 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:20.292 crypto_ram3 : 1.01 23759.62 92.81 0.00 0.00 5334.96 1607.19 6896.88 00:26:20.292 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:20.292 crypto_ram4 : 1.01 23797.66 92.96 0.00 0.00 5315.25 1560.38 6709.64 00:26:20.292 =================================================================================================================== 00:26:20.292 Total : 53664.28 209.63 0.00 0.00 9469.75 1560.38 49682.53 00:26:20.292 00:26:20.292 real 0m3.896s 00:26:20.292 user 0m3.598s 00:26:20.292 sys 0m0.255s 00:26:20.292 12:07:10 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:20.292 12:07:10 
blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:26:20.292 ************************************ 00:26:20.292 END TEST bdev_write_zeroes 00:26:20.292 ************************************ 00:26:20.292 12:07:10 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:26:20.292 12:07:10 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:20.292 12:07:10 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:26:20.292 12:07:10 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:20.292 12:07:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:20.552 ************************************ 00:26:20.552 START TEST bdev_json_nonenclosed 00:26:20.552 ************************************ 00:26:20.552 12:07:10 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:20.552 [2024-07-12 12:07:10.595669] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:26:20.552 [2024-07-12 12:07:10.595704] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid779779 ] 00:26:20.552 [2024-07-12 12:07:10.660845] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:20.552 [2024-07-12 12:07:10.733513] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:20.552 [2024-07-12 12:07:10.733576] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:26:20.552 [2024-07-12 12:07:10.733587] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:20.552 [2024-07-12 12:07:10.733594] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:20.811 00:26:20.811 real 0m0.266s 00:26:20.811 user 0m0.164s 00:26:20.811 sys 0m0.099s 00:26:20.811 12:07:10 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:26:20.811 12:07:10 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:20.811 12:07:10 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:26:20.811 ************************************ 00:26:20.811 END TEST bdev_json_nonenclosed 00:26:20.811 ************************************ 00:26:20.811 12:07:10 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:26:20.811 12:07:10 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:26:20.811 12:07:10 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:20.811 12:07:10 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 
']' 00:26:20.811 12:07:10 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:20.811 12:07:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:20.811 ************************************ 00:26:20.811 START TEST bdev_json_nonarray 00:26:20.811 ************************************ 00:26:20.811 12:07:10 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:20.811 [2024-07-12 12:07:10.931938] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:26:20.811 [2024-07-12 12:07:10.931977] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid779930 ] 00:26:20.811 [2024-07-12 12:07:10.995465] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:21.070 [2024-07-12 12:07:11.065222] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:21.070 [2024-07-12 12:07:11.065294] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:26:21.070 [2024-07-12 12:07:11.065305] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:21.070 [2024-07-12 12:07:11.065312] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:21.070 00:26:21.070 real 0m0.261s 00:26:21.070 user 0m0.165s 00:26:21.070 sys 0m0.093s 00:26:21.070 12:07:11 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:26:21.070 12:07:11 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:21.070 12:07:11 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:26:21.070 ************************************ 00:26:21.070 END TEST bdev_json_nonarray 00:26:21.070 ************************************ 00:26:21.070 12:07:11 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:26:21.070 12:07:11 blockdev_crypto_aesni -- 
bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:26:21.070 12:07:11 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:26:21.070 00:26:21.070 real 1m6.347s 00:26:21.070 user 2m39.853s 00:26:21.070 sys 0m5.823s 00:26:21.070 12:07:11 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:21.070 12:07:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:21.070 ************************************ 00:26:21.070 END TEST blockdev_crypto_aesni 00:26:21.070 ************************************ 00:26:21.070 12:07:11 -- common/autotest_common.sh@1142 -- # return 0 00:26:21.070 12:07:11 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:26:21.070 12:07:11 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:21.070 12:07:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:21.070 12:07:11 -- common/autotest_common.sh@10 -- # set +x 00:26:21.070 ************************************ 00:26:21.070 START TEST blockdev_crypto_sw 00:26:21.070 ************************************ 00:26:21.070 12:07:11 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:26:21.330 * Looking for test storage... 
00:26:21.330 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:26:21.330 
12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=780002 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:26:21.330 12:07:11 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 780002 00:26:21.330 12:07:11 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 780002 ']' 00:26:21.330 12:07:11 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:21.330 12:07:11 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:21.330 12:07:11 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:21.330 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:21.330 12:07:11 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:21.330 12:07:11 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:21.330 [2024-07-12 12:07:11.396547] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:26:21.330 [2024-07-12 12:07:11.396588] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid780002 ] 00:26:21.330 [2024-07-12 12:07:11.460603] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:21.330 [2024-07-12 12:07:11.532377] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:22.267 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:22.267 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:26:22.267 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:26:22.267 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:26:22.267 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:26:22.267 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:22.267 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:22.267 Malloc0 00:26:22.267 Malloc1 00:26:22.267 true 00:26:22.267 true 00:26:22.267 true 00:26:22.267 [2024-07-12 12:07:12.415974] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:22.267 crypto_ram 00:26:22.267 [2024-07-12 12:07:12.424002] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:22.267 crypto_ram2 00:26:22.267 [2024-07-12 12:07:12.432020] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:22.267 crypto_ram3 00:26:22.267 [ 00:26:22.267 { 00:26:22.267 "name": "Malloc1", 00:26:22.267 "aliases": [ 00:26:22.267 "6a307f8e-0a74-4780-ae6a-a17e2ba22153" 00:26:22.267 ], 00:26:22.267 "product_name": "Malloc disk", 00:26:22.267 "block_size": 4096, 00:26:22.267 "num_blocks": 4096, 00:26:22.267 "uuid": "6a307f8e-0a74-4780-ae6a-a17e2ba22153", 00:26:22.267 
"assigned_rate_limits": { 00:26:22.267 "rw_ios_per_sec": 0, 00:26:22.267 "rw_mbytes_per_sec": 0, 00:26:22.267 "r_mbytes_per_sec": 0, 00:26:22.267 "w_mbytes_per_sec": 0 00:26:22.267 }, 00:26:22.267 "claimed": true, 00:26:22.267 "claim_type": "exclusive_write", 00:26:22.267 "zoned": false, 00:26:22.267 "supported_io_types": { 00:26:22.267 "read": true, 00:26:22.267 "write": true, 00:26:22.267 "unmap": true, 00:26:22.267 "flush": true, 00:26:22.267 "reset": true, 00:26:22.267 "nvme_admin": false, 00:26:22.267 "nvme_io": false, 00:26:22.267 "nvme_io_md": false, 00:26:22.267 "write_zeroes": true, 00:26:22.267 "zcopy": true, 00:26:22.267 "get_zone_info": false, 00:26:22.267 "zone_management": false, 00:26:22.267 "zone_append": false, 00:26:22.267 "compare": false, 00:26:22.267 "compare_and_write": false, 00:26:22.267 "abort": true, 00:26:22.267 "seek_hole": false, 00:26:22.267 "seek_data": false, 00:26:22.267 "copy": true, 00:26:22.267 "nvme_iov_md": false 00:26:22.267 }, 00:26:22.267 "memory_domains": [ 00:26:22.267 { 00:26:22.267 "dma_device_id": "system", 00:26:22.267 "dma_device_type": 1 00:26:22.267 }, 00:26:22.267 { 00:26:22.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:22.267 "dma_device_type": 2 00:26:22.267 } 00:26:22.267 ], 00:26:22.267 "driver_specific": {} 00:26:22.267 } 00:26:22.267 ] 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:22.268 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:22.268 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:26:22.268 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:26:22.268 12:07:12 
blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:22.268 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:22.268 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:22.268 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e3804b03-c6a7-5580-be8b-2b0956903d80"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e3804b03-c6a7-5580-be8b-2b0956903d80",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b674a1f9-8f8c-5825-aeda-de1803dbae38"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "b674a1f9-8f8c-5825-aeda-de1803dbae38",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:26:22.527 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 780002 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 780002 ']' 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 780002 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 780002 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 780002' 00:26:22.527 killing process with pid 780002 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 780002 00:26:22.527 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 780002 00:26:22.786 12:07:12 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:22.786 12:07:12 
blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:22.786 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:22.786 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:22.786 12:07:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:22.786 ************************************ 00:26:22.786 START TEST bdev_hello_world 00:26:22.786 ************************************ 00:26:22.786 12:07:13 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:23.044 [2024-07-12 12:07:13.050586] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:26:23.044 [2024-07-12 12:07:13.050621] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid780252 ] 00:26:23.044 [2024-07-12 12:07:13.113325] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:23.044 [2024-07-12 12:07:13.183027] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:23.302 [2024-07-12 12:07:13.338777] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:23.302 [2024-07-12 12:07:13.338844] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:23.302 [2024-07-12 12:07:13.338854] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:23.302 [2024-07-12 12:07:13.346795] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:23.302 [2024-07-12 12:07:13.346806] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:23.302 [2024-07-12 12:07:13.346812] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:23.302 [2024-07-12 12:07:13.354816] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:23.302 [2024-07-12 12:07:13.354828] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:23.302 [2024-07-12 12:07:13.354834] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:23.302 [2024-07-12 12:07:13.393032] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:26:23.302 [2024-07-12 12:07:13.393057] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:26:23.302 [2024-07-12 12:07:13.393068] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io 
channel 00:26:23.302 [2024-07-12 12:07:13.394208] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:26:23.302 [2024-07-12 12:07:13.394263] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:26:23.302 [2024-07-12 12:07:13.394271] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:26:23.302 [2024-07-12 12:07:13.394293] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:26:23.302 00:26:23.302 [2024-07-12 12:07:13.394304] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:26:23.561 00:26:23.561 real 0m0.563s 00:26:23.561 user 0m0.399s 00:26:23.561 sys 0m0.151s 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:26:23.561 ************************************ 00:26:23.561 END TEST bdev_hello_world 00:26:23.561 ************************************ 00:26:23.561 12:07:13 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:26:23.561 12:07:13 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:26:23.561 12:07:13 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:26:23.561 12:07:13 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:23.561 12:07:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:23.561 ************************************ 00:26:23.561 START TEST bdev_bounds 00:26:23.561 ************************************ 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=780494 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:26:23.561 12:07:13 
blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 780494' 00:26:23.561 Process bdevio pid: 780494 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 780494 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 780494 ']' 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:23.561 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:23.561 12:07:13 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:23.561 [2024-07-12 12:07:13.683283] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:26:23.561 [2024-07-12 12:07:13.683318] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid780494 ] 00:26:23.561 [2024-07-12 12:07:13.745656] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:23.820 [2024-07-12 12:07:13.824661] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:23.820 [2024-07-12 12:07:13.824758] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:23.820 [2024-07-12 12:07:13.824760] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:23.820 [2024-07-12 12:07:13.981286] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:23.820 [2024-07-12 12:07:13.981336] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:23.820 [2024-07-12 12:07:13.981359] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:23.820 [2024-07-12 12:07:13.989305] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:23.820 [2024-07-12 12:07:13.989315] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:23.820 [2024-07-12 12:07:13.989320] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:23.820 [2024-07-12 12:07:13.997329] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:23.820 [2024-07-12 12:07:13.997338] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:23.820 [2024-07-12 12:07:13.997344] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:24.389 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 
00:26:24.389 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:26:24.389 12:07:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:24.389 I/O targets: 00:26:24.389 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:26:24.389 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:26:24.389 00:26:24.389 00:26:24.389 CUnit - A unit testing framework for C - Version 2.1-3 00:26:24.389 http://cunit.sourceforge.net/ 00:26:24.389 00:26:24.389 00:26:24.389 Suite: bdevio tests on: crypto_ram3 00:26:24.389 Test: blockdev write read block ...passed 00:26:24.389 Test: blockdev write zeroes read block ...passed 00:26:24.389 Test: blockdev write zeroes read no split ...passed 00:26:24.389 Test: blockdev write zeroes read split ...passed 00:26:24.389 Test: blockdev write zeroes read split partial ...passed 00:26:24.389 Test: blockdev reset ...passed 00:26:24.389 Test: blockdev write read 8 blocks ...passed 00:26:24.389 Test: blockdev write read size > 128k ...passed 00:26:24.389 Test: blockdev write read invalid size ...passed 00:26:24.389 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:24.389 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:24.389 Test: blockdev write read max offset ...passed 00:26:24.389 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:24.389 Test: blockdev writev readv 8 blocks ...passed 00:26:24.389 Test: blockdev writev readv 30 x 1block ...passed 00:26:24.389 Test: blockdev writev readv block ...passed 00:26:24.389 Test: blockdev writev readv size > 128k ...passed 00:26:24.389 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:24.389 Test: blockdev comparev and writev ...passed 00:26:24.389 Test: blockdev nvme passthru rw ...passed 00:26:24.389 Test: blockdev nvme passthru vendor specific ...passed 00:26:24.389 
Test: blockdev nvme admin passthru ...passed 00:26:24.389 Test: blockdev copy ...passed 00:26:24.389 Suite: bdevio tests on: crypto_ram 00:26:24.389 Test: blockdev write read block ...passed 00:26:24.389 Test: blockdev write zeroes read block ...passed 00:26:24.389 Test: blockdev write zeroes read no split ...passed 00:26:24.390 Test: blockdev write zeroes read split ...passed 00:26:24.390 Test: blockdev write zeroes read split partial ...passed 00:26:24.390 Test: blockdev reset ...passed 00:26:24.390 Test: blockdev write read 8 blocks ...passed 00:26:24.390 Test: blockdev write read size > 128k ...passed 00:26:24.390 Test: blockdev write read invalid size ...passed 00:26:24.390 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:24.390 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:24.390 Test: blockdev write read max offset ...passed 00:26:24.390 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:24.390 Test: blockdev writev readv 8 blocks ...passed 00:26:24.390 Test: blockdev writev readv 30 x 1block ...passed 00:26:24.390 Test: blockdev writev readv block ...passed 00:26:24.390 Test: blockdev writev readv size > 128k ...passed 00:26:24.390 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:24.390 Test: blockdev comparev and writev ...passed 00:26:24.390 Test: blockdev nvme passthru rw ...passed 00:26:24.390 Test: blockdev nvme passthru vendor specific ...passed 00:26:24.390 Test: blockdev nvme admin passthru ...passed 00:26:24.390 Test: blockdev copy ...passed 00:26:24.390 00:26:24.390 Run Summary: Type Total Ran Passed Failed Inactive 00:26:24.390 suites 2 2 n/a 0 0 00:26:24.390 tests 46 46 46 0 0 00:26:24.390 asserts 260 260 260 0 n/a 00:26:24.390 00:26:24.390 Elapsed time = 0.078 seconds 00:26:24.390 0 00:26:24.390 12:07:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 780494 00:26:24.390 12:07:14 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@948 -- # '[' -z 780494 ']' 00:26:24.390 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 780494 00:26:24.390 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:26:24.390 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:24.390 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 780494 00:26:24.649 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:24.649 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:24.649 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 780494' 00:26:24.649 killing process with pid 780494 00:26:24.649 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 780494 00:26:24.649 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 780494 00:26:24.649 12:07:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:26:24.649 00:26:24.649 real 0m1.219s 00:26:24.649 user 0m3.253s 00:26:24.649 sys 0m0.280s 00:26:24.649 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:24.649 12:07:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:24.649 ************************************ 00:26:24.649 END TEST bdev_bounds 00:26:24.649 ************************************ 00:26:24.649 12:07:14 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:26:24.649 12:07:14 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:26:24.649 12:07:14 blockdev_crypto_sw -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:24.649 12:07:14 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:24.649 12:07:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:24.909 ************************************ 00:26:24.909 START TEST bdev_nbd 00:26:24.909 ************************************ 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=780715 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 780715 /var/tmp/spdk-nbd.sock 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 780715 ']' 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:26:24.909 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:24.909 12:07:14 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:24.909 [2024-07-12 12:07:14.976894] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:26:24.909 [2024-07-12 12:07:14.976934] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:24.909 [2024-07-12 12:07:15.041415] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:24.909 [2024-07-12 12:07:15.113696] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:25.168 [2024-07-12 12:07:15.267320] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:25.168 [2024-07-12 12:07:15.267371] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:25.168 [2024-07-12 12:07:15.267380] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:25.168 [2024-07-12 12:07:15.275338] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:25.168 [2024-07-12 12:07:15.275350] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:25.168 [2024-07-12 12:07:15.275355] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:25.168 [2024-07-12 12:07:15.283361] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:25.168 [2024-07-12 12:07:15.283373] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:25.168 [2024-07-12 12:07:15.283378] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram3' 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # 
(( i = 1 )) 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:25.735 1+0 records in 00:26:25.735 1+0 records out 00:26:25.735 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000220702 s, 18.6 MB/s 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:26:25.735 12:07:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:26:25.992 12:07:16 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:25.992 1+0 records in 00:26:25.992 1+0 records out 00:26:25.992 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237202 s, 17.3 MB/s 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:25.992 12:07:16 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:26:25.992 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:26:26.251 { 00:26:26.251 "nbd_device": "/dev/nbd0", 00:26:26.251 "bdev_name": "crypto_ram" 00:26:26.251 }, 00:26:26.251 { 00:26:26.251 "nbd_device": "/dev/nbd1", 00:26:26.251 "bdev_name": "crypto_ram3" 00:26:26.251 } 00:26:26.251 ]' 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:26:26.251 { 00:26:26.251 "nbd_device": "/dev/nbd0", 00:26:26.251 "bdev_name": "crypto_ram" 00:26:26.251 }, 00:26:26.251 { 00:26:26.251 "nbd_device": "/dev/nbd1", 00:26:26.251 "bdev_name": "crypto_ram3" 00:26:26.251 } 00:26:26.251 ]' 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:26.251 12:07:16 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:26.510 
12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:26.510 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@91 -- # local bdev_list 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:26.768 12:07:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:26:27.025 /dev/nbd0 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:27.025 12:07:17 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:27.025 1+0 records in 00:26:27.025 1+0 records out 00:26:27.025 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00018224 s, 22.5 MB/s 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:27.025 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:27.026 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:27.026 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:27.026 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:27.026 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:27.026 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:26:27.283 /dev/nbd1 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:27.283 12:07:17 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:27.283 1+0 records in 00:26:27.283 1+0 records out 00:26:27.283 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002463 s, 16.6 MB/s 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:26:27.283 { 00:26:27.283 "nbd_device": "/dev/nbd0", 00:26:27.283 "bdev_name": "crypto_ram" 00:26:27.283 }, 00:26:27.283 { 00:26:27.283 "nbd_device": "/dev/nbd1", 00:26:27.283 "bdev_name": "crypto_ram3" 00:26:27.283 } 00:26:27.283 ]' 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:26:27.283 { 00:26:27.283 "nbd_device": "/dev/nbd0", 00:26:27.283 "bdev_name": "crypto_ram" 00:26:27.283 }, 00:26:27.283 { 00:26:27.283 "nbd_device": "/dev/nbd1", 00:26:27.283 "bdev_name": "crypto_ram3" 00:26:27.283 } 00:26:27.283 ]' 00:26:27.283 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:26:27.542 /dev/nbd1' 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:26:27.542 /dev/nbd1' 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:26:27.542 256+0 records in 00:26:27.542 256+0 records out 00:26:27.542 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103369 s, 101 MB/s 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:26:27.542 256+0 records in 00:26:27.542 256+0 records out 00:26:27.542 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.014853 s, 70.6 MB/s 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:26:27.542 256+0 records in 00:26:27.542 256+0 records out 00:26:27.542 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0225807 s, 46.4 MB/s 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 
00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:27.542 12:07:17 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:27.542 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:27.801 12:07:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@41 -- # break 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:27.801 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:26:28.059 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:26:28.316 malloc_lvol_verify 00:26:28.316 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:26:28.316 ba97224c-ff40-438b-91b5-3e8ccc48a564 00:26:28.575 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:26:28.575 738f1e0e-ce8d-4099-aba9-75892d785565 00:26:28.575 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:26:28.835 /dev/nbd0 00:26:28.835 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:26:28.835 mke2fs 1.46.5 (30-Dec-2021) 00:26:28.835 Discarding device blocks: 0/4096 done 00:26:28.835 Creating filesystem with 4096 1k blocks and 1024 inodes 00:26:28.835 00:26:28.835 Allocating group tables: 0/1 done 00:26:28.835 Writing inode tables: 0/1 done 00:26:28.835 Creating journal (1024 blocks): done 00:26:28.835 Writing superblocks and filesystem accounting information: 0/1 done 00:26:28.835 00:26:28.835 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:26:28.835 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:26:28.835 
12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:28.835 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:28.835 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:28.835 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:28.835 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:28.835 12:07:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:28.835 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:28.835 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:28.835 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:28.835 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:28.835 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:28.835 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 780715 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 780715 ']' 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 780715 00:26:29.095 12:07:19 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 780715 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 780715' 00:26:29.095 killing process with pid 780715 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 780715 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 780715 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:26:29.095 00:26:29.095 real 0m4.395s 00:26:29.095 user 0m6.365s 00:26:29.095 sys 0m1.473s 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:29.095 12:07:19 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:29.095 ************************************ 00:26:29.095 END TEST bdev_nbd 00:26:29.095 ************************************ 00:26:29.359 12:07:19 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:26:29.359 12:07:19 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:26:29.359 12:07:19 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:26:29.359 12:07:19 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:26:29.359 12:07:19 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:26:29.359 12:07:19 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 
00:26:29.359 12:07:19 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:29.359 12:07:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:29.359 ************************************ 00:26:29.359 START TEST bdev_fio 00:26:29.359 ************************************ 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:29.359 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 
00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- 
bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:29.359 ************************************ 00:26:29.359 START TEST bdev_fio_rw_verify 00:26:29.359 ************************************ 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:29.359 12:07:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:29.616 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:29.617 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:29.617 fio-3.35 00:26:29.617 Starting 2 threads 00:26:41.825 00:26:41.825 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=781697: Fri Jul 12 12:07:30 2024 00:26:41.825 read: IOPS=31.5k, BW=123MiB/s (129MB/s)(1229MiB/10000msec) 00:26:41.825 slat (usec): min=8, max=1195, avg=14.13, stdev= 3.64 00:26:41.825 clat (usec): min=5, max=1366, avg=102.13, stdev=41.75 00:26:41.825 lat (usec): min=17, max=1381, avg=116.26, stdev=42.92 00:26:41.825 clat percentiles (usec): 00:26:41.825 | 50.000th=[ 100], 99.000th=[ 200], 99.900th=[ 217], 
99.990th=[ 243], 00:26:41.825 | 99.999th=[ 1319] 00:26:41.825 write: IOPS=37.8k, BW=148MiB/s (155MB/s)(1400MiB/9483msec); 0 zone resets 00:26:41.825 slat (usec): min=9, max=314, avg=23.44, stdev= 3.59 00:26:41.825 clat (usec): min=17, max=845, avg=136.30, stdev=62.74 00:26:41.825 lat (usec): min=35, max=939, avg=159.75, stdev=63.98 00:26:41.825 clat percentiles (usec): 00:26:41.825 | 50.000th=[ 133], 99.000th=[ 273], 99.900th=[ 293], 99.990th=[ 529], 00:26:41.825 | 99.999th=[ 807] 00:26:41.825 bw ( KiB/s): min=137816, max=149832, per=94.98%, avg=143583.16, stdev=1860.80, samples=38 00:26:41.825 iops : min=34454, max=37458, avg=35895.79, stdev=465.20, samples=38 00:26:41.825 lat (usec) : 10=0.01%, 20=0.01%, 50=9.09%, 100=31.98%, 250=56.59% 00:26:41.825 lat (usec) : 500=2.33%, 750=0.01%, 1000=0.01% 00:26:41.825 lat (msec) : 2=0.01% 00:26:41.825 cpu : usr=99.70%, sys=0.00%, ctx=22, majf=0, minf=479 00:26:41.825 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:41.825 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:41.825 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:41.825 issued rwts: total=314739,358408,0,0 short=0,0,0,0 dropped=0,0,0,0 00:26:41.825 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:41.825 00:26:41.825 Run status group 0 (all jobs): 00:26:41.825 READ: bw=123MiB/s (129MB/s), 123MiB/s-123MiB/s (129MB/s-129MB/s), io=1229MiB (1289MB), run=10000-10000msec 00:26:41.825 WRITE: bw=148MiB/s (155MB/s), 148MiB/s-148MiB/s (155MB/s-155MB/s), io=1400MiB (1468MB), run=9483-9483msec 00:26:41.825 00:26:41.825 real 0m10.954s 00:26:41.825 user 0m26.661s 00:26:41.825 sys 0m0.264s 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:26:41.825 ************************************ 
00:26:41.825 END TEST bdev_fio_rw_verify 00:26:41.825 ************************************ 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:26:41.825 12:07:30 
blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e3804b03-c6a7-5580-be8b-2b0956903d80"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e3804b03-c6a7-5580-be8b-2b0956903d80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b674a1f9-8f8c-5825-aeda-de1803dbae38"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "b674a1f9-8f8c-5825-aeda-de1803dbae38",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:26:41.825 crypto_ram3 ]] 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e3804b03-c6a7-5580-be8b-2b0956903d80"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e3804b03-c6a7-5580-be8b-2b0956903d80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' 
"write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b674a1f9-8f8c-5825-aeda-de1803dbae38"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "b674a1f9-8f8c-5825-aeda-de1803dbae38",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": 
"test_dek_sw3"' ' }' ' }' '}' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:26:41.825 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:41.826 ************************************ 00:26:41.826 START TEST bdev_fio_trim 00:26:41.826 ************************************ 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1345 -- # grep libasan 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:41.826 12:07:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:41.826 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:41.826 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:41.826 fio-3.35 00:26:41.826 
Starting 2 threads 00:26:51.831 00:26:51.831 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=783638: Fri Jul 12 12:07:41 2024 00:26:51.831 write: IOPS=60.2k, BW=235MiB/s (247MB/s)(2352MiB/10001msec); 0 zone resets 00:26:51.831 slat (usec): min=9, max=1250, avg=15.06, stdev= 3.73 00:26:51.831 clat (usec): min=24, max=1436, avg=107.06, stdev=60.99 00:26:51.831 lat (usec): min=34, max=1487, avg=122.12, stdev=63.40 00:26:51.831 clat percentiles (usec): 00:26:51.831 | 50.000th=[ 86], 99.000th=[ 215], 99.900th=[ 241], 99.990th=[ 469], 00:26:51.831 | 99.999th=[ 603] 00:26:51.831 bw ( KiB/s): min=232944, max=243976, per=100.00%, avg=240837.89, stdev=1071.83, samples=38 00:26:51.831 iops : min=58236, max=60994, avg=60209.47, stdev=267.96, samples=38 00:26:51.831 trim: IOPS=60.2k, BW=235MiB/s (247MB/s)(2352MiB/10001msec); 0 zone resets 00:26:51.831 slat (nsec): min=3805, max=72246, avg=6772.37, stdev=1953.07 00:26:51.831 clat (usec): min=29, max=1336, avg=70.98, stdev=20.97 00:26:51.831 lat (usec): min=35, max=1344, avg=77.75, stdev=21.30 00:26:51.831 clat percentiles (usec): 00:26:51.831 | 50.000th=[ 68], 99.000th=[ 111], 99.900th=[ 130], 99.990th=[ 253], 00:26:51.831 | 99.999th=[ 367] 00:26:51.831 bw ( KiB/s): min=232968, max=243976, per=100.00%, avg=240839.16, stdev=1069.81, samples=38 00:26:51.831 iops : min=58242, max=60994, avg=60209.79, stdev=267.45, samples=38 00:26:51.831 lat (usec) : 50=20.91%, 100=53.61%, 250=25.44%, 500=0.04%, 750=0.01% 00:26:51.831 lat (msec) : 2=0.01% 00:26:51.831 cpu : usr=99.72%, sys=0.00%, ctx=28, majf=0, minf=259 00:26:51.831 IO depths : 1=7.5%, 2=17.5%, 4=60.0%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:51.831 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:51.831 complete : 0=0.0%, 4=87.0%, 8=13.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:51.831 issued rwts: total=0,602152,602152,0 short=0,0,0,0 dropped=0,0,0,0 00:26:51.831 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:51.831 
00:26:51.831 Run status group 0 (all jobs): 00:26:51.831 WRITE: bw=235MiB/s (247MB/s), 235MiB/s-235MiB/s (247MB/s-247MB/s), io=2352MiB (2466MB), run=10001-10001msec 00:26:51.831 TRIM: bw=235MiB/s (247MB/s), 235MiB/s-235MiB/s (247MB/s-247MB/s), io=2352MiB (2466MB), run=10001-10001msec 00:26:51.831 00:26:51.831 real 0m10.943s 00:26:51.831 user 0m27.018s 00:26:51.831 sys 0m0.257s 00:26:51.831 12:07:41 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:51.831 12:07:41 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:26:51.831 ************************************ 00:26:51.831 END TEST bdev_fio_trim 00:26:51.831 ************************************ 00:26:51.831 12:07:41 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:26:51.831 12:07:41 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:26:51.831 12:07:41 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:51.831 12:07:41 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:26:51.831 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:51.831 12:07:41 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:26:51.831 00:26:51.831 real 0m22.197s 00:26:51.831 user 0m53.845s 00:26:51.831 sys 0m0.672s 00:26:51.831 12:07:41 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:51.831 12:07:41 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:51.831 ************************************ 00:26:51.831 END TEST bdev_fio 00:26:51.831 ************************************ 00:26:51.831 12:07:41 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:26:51.831 12:07:41 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:51.831 12:07:41 blockdev_crypto_sw -- 
bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:51.831 12:07:41 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:26:51.831 12:07:41 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:51.831 12:07:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:51.831 ************************************ 00:26:51.832 START TEST bdev_verify 00:26:51.832 ************************************ 00:26:51.832 12:07:41 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:51.832 [2024-07-12 12:07:41.695780] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:26:51.832 [2024-07-12 12:07:41.695816] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid785306 ] 00:26:51.832 [2024-07-12 12:07:41.757743] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:51.832 [2024-07-12 12:07:41.839831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:51.832 [2024-07-12 12:07:41.839834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:51.832 [2024-07-12 12:07:41.997988] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:51.832 [2024-07-12 12:07:41.998045] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:51.832 [2024-07-12 12:07:41.998054] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:51.832 [2024-07-12 12:07:42.006010] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:51.832 [2024-07-12 12:07:42.006023] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:51.832 [2024-07-12 12:07:42.006029] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:51.832 [2024-07-12 12:07:42.014032] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:51.832 [2024-07-12 12:07:42.014045] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:51.832 [2024-07-12 12:07:42.014050] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:51.832 Running I/O for 5 seconds... 
00:26:57.100 00:26:57.100 Latency(us) 00:26:57.100 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:57.100 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:57.101 Verification LBA range: start 0x0 length 0x800 00:26:57.101 crypto_ram : 5.01 8763.44 34.23 0.00 0.00 14553.58 1162.48 19348.72 00:26:57.101 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:57.101 Verification LBA range: start 0x800 length 0x800 00:26:57.101 crypto_ram : 5.01 8797.31 34.36 0.00 0.00 14499.07 1131.28 19348.72 00:26:57.101 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:57.101 Verification LBA range: start 0x0 length 0x800 00:26:57.101 crypto_ram3 : 5.02 4389.68 17.15 0.00 0.00 29039.94 1458.96 22469.49 00:26:57.101 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:57.101 Verification LBA range: start 0x800 length 0x800 00:26:57.101 crypto_ram3 : 5.02 4414.29 17.24 0.00 0.00 28875.99 1287.31 22594.32 00:26:57.101 =================================================================================================================== 00:26:57.101 Total : 26364.72 102.99 0.00 0.00 19350.75 1131.28 22594.32 00:26:57.101 00:26:57.101 real 0m5.623s 00:26:57.101 user 0m10.766s 00:26:57.101 sys 0m0.157s 00:26:57.101 12:07:47 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:57.101 12:07:47 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:26:57.101 ************************************ 00:26:57.101 END TEST bdev_verify 00:26:57.101 ************************************ 00:26:57.101 12:07:47 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:26:57.101 12:07:47 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:57.101 12:07:47 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:26:57.101 12:07:47 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:57.101 12:07:47 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:57.101 ************************************ 00:26:57.101 START TEST bdev_verify_big_io 00:26:57.101 ************************************ 00:26:57.101 12:07:47 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:57.359 [2024-07-12 12:07:47.393107] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:26:57.359 [2024-07-12 12:07:47.393144] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid786242 ] 00:26:57.359 [2024-07-12 12:07:47.457111] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:57.359 [2024-07-12 12:07:47.529571] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:57.359 [2024-07-12 12:07:47.529573] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:57.618 [2024-07-12 12:07:47.686341] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:57.618 [2024-07-12 12:07:47.686387] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:57.618 [2024-07-12 12:07:47.686395] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:57.618 [2024-07-12 12:07:47.694361] vbdev_crypto_rpc.c: 
115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:57.618 [2024-07-12 12:07:47.694373] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:57.618 [2024-07-12 12:07:47.694378] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:57.618 [2024-07-12 12:07:47.702384] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:57.618 [2024-07-12 12:07:47.702396] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:57.618 [2024-07-12 12:07:47.702401] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:57.618 Running I/O for 5 seconds... 00:27:02.890 00:27:02.890 Latency(us) 00:27:02.891 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:02.891 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:27:02.891 Verification LBA range: start 0x0 length 0x80 00:27:02.891 crypto_ram : 5.01 689.27 43.08 0.00 0.00 182364.14 5118.05 250659.60 00:27:02.891 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:27:02.891 Verification LBA range: start 0x80 length 0x80 00:27:02.891 crypto_ram : 5.02 688.19 43.01 0.00 0.00 182625.78 4899.60 251658.24 00:27:02.891 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:27:02.891 Verification LBA range: start 0x0 length 0x80 00:27:02.891 crypto_ram3 : 5.24 390.83 24.43 0.00 0.00 313678.36 4587.52 261644.68 00:27:02.891 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:27:02.891 Verification LBA range: start 0x80 length 0x80 00:27:02.891 crypto_ram3 : 5.25 390.31 24.39 0.00 0.00 314249.17 3885.35 263641.97 00:27:02.891 =================================================================================================================== 00:27:02.891 Total : 2158.61 134.91 0.00 
0.00 231413.58 3885.35 263641.97 00:27:03.148 00:27:03.148 real 0m5.861s 00:27:03.148 user 0m11.234s 00:27:03.148 sys 0m0.166s 00:27:03.148 12:07:53 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:03.148 12:07:53 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:27:03.148 ************************************ 00:27:03.148 END TEST bdev_verify_big_io 00:27:03.148 ************************************ 00:27:03.148 12:07:53 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:27:03.148 12:07:53 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:03.148 12:07:53 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:27:03.148 12:07:53 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:03.148 12:07:53 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:03.148 ************************************ 00:27:03.148 START TEST bdev_write_zeroes 00:27:03.148 ************************************ 00:27:03.148 12:07:53 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:03.148 [2024-07-12 12:07:53.323862] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:27:03.148 [2024-07-12 12:07:53.323908] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid787305 ] 00:27:03.148 [2024-07-12 12:07:53.384917] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:03.405 [2024-07-12 12:07:53.455121] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:03.405 [2024-07-12 12:07:53.605031] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:03.405 [2024-07-12 12:07:53.605084] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:03.405 [2024-07-12 12:07:53.605092] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:03.405 [2024-07-12 12:07:53.613050] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:03.405 [2024-07-12 12:07:53.613062] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:03.405 [2024-07-12 12:07:53.613067] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:03.405 [2024-07-12 12:07:53.621069] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:03.405 [2024-07-12 12:07:53.621078] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:27:03.405 [2024-07-12 12:07:53.621083] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:03.663 Running I/O for 1 seconds... 
00:27:04.597 00:27:04.597 Latency(us) 00:27:04.597 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:04.597 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:04.597 crypto_ram : 1.01 41759.50 163.12 0.00 0.00 3059.38 807.50 4587.52 00:27:04.597 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:04.597 crypto_ram3 : 1.01 20853.47 81.46 0.00 0.00 6108.83 3776.12 6834.47 00:27:04.597 =================================================================================================================== 00:27:04.597 Total : 62612.97 244.58 0.00 0.00 4075.86 807.50 6834.47 00:27:04.855 00:27:04.855 real 0m1.576s 00:27:04.855 user 0m1.409s 00:27:04.855 sys 0m0.151s 00:27:04.855 12:07:54 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:04.855 12:07:54 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:27:04.855 ************************************ 00:27:04.855 END TEST bdev_write_zeroes 00:27:04.855 ************************************ 00:27:04.855 12:07:54 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:27:04.855 12:07:54 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:04.855 12:07:54 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:27:04.855 12:07:54 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:04.855 12:07:54 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:04.855 ************************************ 00:27:04.855 START TEST bdev_json_nonenclosed 00:27:04.855 ************************************ 00:27:04.855 12:07:54 blockdev_crypto_sw.bdev_json_nonenclosed -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:04.855 [2024-07-12 12:07:54.967864] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:27:04.855 [2024-07-12 12:07:54.967898] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid787554 ] 00:27:04.855 [2024-07-12 12:07:55.029791] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:04.855 [2024-07-12 12:07:55.100793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:04.855 [2024-07-12 12:07:55.100843] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:27:04.855 [2024-07-12 12:07:55.100854] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:04.855 [2024-07-12 12:07:55.100860] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:05.112 00:27:05.112 real 0m0.258s 00:27:05.112 user 0m0.169s 00:27:05.112 sys 0m0.086s 00:27:05.112 12:07:55 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:27:05.112 12:07:55 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:05.112 12:07:55 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:27:05.112 ************************************ 00:27:05.112 END TEST bdev_json_nonenclosed 00:27:05.112 ************************************ 00:27:05.112 12:07:55 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:27:05.113 12:07:55 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:27:05.113 12:07:55 blockdev_crypto_sw -- 
bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:05.113 12:07:55 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:27:05.113 12:07:55 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:05.113 12:07:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:05.113 ************************************ 00:27:05.113 START TEST bdev_json_nonarray 00:27:05.113 ************************************ 00:27:05.113 12:07:55 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:05.113 [2024-07-12 12:07:55.295092] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:27:05.113 [2024-07-12 12:07:55.295126] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid787579 ] 00:27:05.113 [2024-07-12 12:07:55.355415] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:05.371 [2024-07-12 12:07:55.426124] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:05.371 [2024-07-12 12:07:55.426178] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:27:05.371 [2024-07-12 12:07:55.426189] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:05.371 [2024-07-12 12:07:55.426196] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:05.371 00:27:05.371 real 0m0.254s 00:27:05.371 user 0m0.157s 00:27:05.371 sys 0m0.095s 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:27:05.371 ************************************ 00:27:05.371 END TEST bdev_json_nonarray 00:27:05.371 ************************************ 00:27:05.371 12:07:55 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:27:05.371 12:07:55 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:27:05.371 12:07:55 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:27:05.371 12:07:55 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:27:05.371 12:07:55 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:27:05.371 12:07:55 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:27:05.371 12:07:55 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:27:05.371 12:07:55 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:05.371 12:07:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:05.371 ************************************ 00:27:05.371 START TEST bdev_crypto_enomem 00:27:05.371 ************************************ 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local 
base_dev=base0 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=787610 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 787610 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 787610 ']' 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:05.371 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:05.371 12:07:55 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:05.629 [2024-07-12 12:07:55.623024] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:27:05.629 [2024-07-12 12:07:55.623060] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid787610 ] 00:27:05.629 [2024-07-12 12:07:55.686824] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:05.629 [2024-07-12 12:07:55.764140] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:06.195 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:06.195 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:27:06.195 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:27:06.195 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:06.195 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:06.195 true 00:27:06.195 base0 00:27:06.195 true 00:27:06.454 [2024-07-12 12:07:56.442856] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:06.454 crypt0 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:06.454 [ 00:27:06.454 { 00:27:06.454 "name": "crypt0", 00:27:06.454 "aliases": [ 00:27:06.454 "9f780666-d0f5-5871-a631-1acbb4c94103" 00:27:06.454 ], 00:27:06.454 "product_name": "crypto", 00:27:06.454 "block_size": 512, 00:27:06.454 "num_blocks": 2097152, 00:27:06.454 "uuid": "9f780666-d0f5-5871-a631-1acbb4c94103", 00:27:06.454 "assigned_rate_limits": { 00:27:06.454 "rw_ios_per_sec": 0, 00:27:06.454 "rw_mbytes_per_sec": 0, 00:27:06.454 "r_mbytes_per_sec": 0, 00:27:06.454 "w_mbytes_per_sec": 0 00:27:06.454 }, 00:27:06.454 "claimed": false, 00:27:06.454 "zoned": false, 00:27:06.454 "supported_io_types": { 00:27:06.454 "read": true, 00:27:06.454 "write": true, 00:27:06.454 "unmap": false, 00:27:06.454 "flush": false, 00:27:06.454 "reset": true, 00:27:06.454 "nvme_admin": false, 00:27:06.454 "nvme_io": false, 00:27:06.454 "nvme_io_md": false, 00:27:06.454 "write_zeroes": true, 00:27:06.454 "zcopy": false, 00:27:06.454 "get_zone_info": false, 00:27:06.454 "zone_management": false, 00:27:06.454 "zone_append": false, 00:27:06.454 "compare": false, 00:27:06.454 "compare_and_write": false, 00:27:06.454 "abort": false, 00:27:06.454 "seek_hole": false, 
00:27:06.454 "seek_data": false, 00:27:06.454 "copy": false, 00:27:06.454 "nvme_iov_md": false 00:27:06.454 }, 00:27:06.454 "memory_domains": [ 00:27:06.454 { 00:27:06.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:06.454 "dma_device_type": 2 00:27:06.454 } 00:27:06.454 ], 00:27:06.454 "driver_specific": { 00:27:06.454 "crypto": { 00:27:06.454 "base_bdev_name": "EE_base0", 00:27:06.454 "name": "crypt0", 00:27:06.454 "key_name": "test_dek_sw" 00:27:06.454 } 00:27:06.454 } 00:27:06.454 } 00:27:06.454 ] 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=787832 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:06.454 12:07:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:27:06.455 Running I/O for 5 seconds... 
00:27:07.393 12:07:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:27:07.393 12:07:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:07.393 12:07:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:07.393 12:07:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:07.393 12:07:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 787832 00:27:11.581 00:27:11.581 Latency(us) 00:27:11.581 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:11.581 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:27:11.581 crypt0 : 5.00 56415.22 220.37 0.00 0.00 564.63 278.92 1217.10 00:27:11.581 =================================================================================================================== 00:27:11.581 Total : 56415.22 220.37 0.00 0.00 564.63 278.92 1217.10 00:27:11.581 0 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 787610 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 787610 ']' 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 787610 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:27:11.581 12:08:01 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 787610 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 787610' 00:27:11.581 killing process with pid 787610 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 787610 00:27:11.581 Received shutdown signal, test time was about 5.000000 seconds 00:27:11.581 00:27:11.581 Latency(us) 00:27:11.581 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:11.581 =================================================================================================================== 00:27:11.581 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 787610 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:27:11.581 00:27:11.581 real 0m6.219s 00:27:11.581 user 0m6.411s 00:27:11.581 sys 0m0.260s 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:11.581 12:08:01 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:27:11.581 ************************************ 00:27:11.581 END TEST bdev_crypto_enomem 00:27:11.581 ************************************ 00:27:11.581 12:08:01 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:27:11.839 12:08:01 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT 
SIGTERM EXIT 00:27:11.839 12:08:01 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:27:11.839 12:08:01 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:27:11.839 12:08:01 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:11.839 12:08:01 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:27:11.839 12:08:01 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:27:11.839 12:08:01 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:27:11.839 12:08:01 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:27:11.839 00:27:11.839 real 0m50.593s 00:27:11.839 user 1m36.091s 00:27:11.839 sys 0m4.404s 00:27:11.839 12:08:01 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:11.839 12:08:01 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:11.839 ************************************ 00:27:11.839 END TEST blockdev_crypto_sw 00:27:11.839 ************************************ 00:27:11.839 12:08:01 -- common/autotest_common.sh@1142 -- # return 0 00:27:11.839 12:08:01 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:27:11.839 12:08:01 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:11.839 12:08:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:11.840 12:08:01 -- common/autotest_common.sh@10 -- # set +x 00:27:11.840 ************************************ 00:27:11.840 START TEST blockdev_crypto_qat 00:27:11.840 ************************************ 00:27:11.840 12:08:01 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:27:11.840 * Looking for test storage... 
00:27:11.840 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # 
env_ctx= 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=788796 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 788796 00:27:11.840 12:08:01 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:27:11.840 12:08:01 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 788796 ']' 00:27:11.840 12:08:01 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:11.840 12:08:01 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:11.840 12:08:01 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:11.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:11.840 12:08:01 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:11.840 12:08:01 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:11.840 [2024-07-12 12:08:02.047124] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:27:11.840 [2024-07-12 12:08:02.047167] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid788796 ] 00:27:12.098 [2024-07-12 12:08:02.110849] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:12.098 [2024-07-12 12:08:02.182076] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:12.666 12:08:02 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:12.666 12:08:02 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:27:12.666 12:08:02 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:27:12.666 12:08:02 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:27:12.666 12:08:02 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:27:12.666 12:08:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:12.666 12:08:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:12.666 [2024-07-12 12:08:02.844030] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:12.666 [2024-07-12 12:08:02.852060] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:12.666 [2024-07-12 12:08:02.860076] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:12.930 [2024-07-12 12:08:02.922299] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:15.462 true 00:27:15.462 true 00:27:15.462 true 00:27:15.462 true 00:27:15.462 Malloc0 00:27:15.462 Malloc1 00:27:15.462 Malloc2 00:27:15.462 Malloc3 00:27:15.462 [2024-07-12 12:08:05.192681] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:15.462 crypto_ram 
00:27:15.462 [2024-07-12 12:08:05.200696] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:15.462 crypto_ram1 00:27:15.462 [2024-07-12 12:08:05.208716] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:15.462 crypto_ram2 00:27:15.462 [2024-07-12 12:08:05.216738] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:15.462 crypto_ram3 00:27:15.462 [ 00:27:15.462 { 00:27:15.462 "name": "Malloc1", 00:27:15.462 "aliases": [ 00:27:15.462 "da8abc91-fe7f-4f26-8980-ef63e66da98b" 00:27:15.462 ], 00:27:15.462 "product_name": "Malloc disk", 00:27:15.462 "block_size": 512, 00:27:15.462 "num_blocks": 65536, 00:27:15.462 "uuid": "da8abc91-fe7f-4f26-8980-ef63e66da98b", 00:27:15.462 "assigned_rate_limits": { 00:27:15.462 "rw_ios_per_sec": 0, 00:27:15.462 "rw_mbytes_per_sec": 0, 00:27:15.462 "r_mbytes_per_sec": 0, 00:27:15.462 "w_mbytes_per_sec": 0 00:27:15.462 }, 00:27:15.462 "claimed": true, 00:27:15.462 "claim_type": "exclusive_write", 00:27:15.462 "zoned": false, 00:27:15.462 "supported_io_types": { 00:27:15.462 "read": true, 00:27:15.462 "write": true, 00:27:15.462 "unmap": true, 00:27:15.462 "flush": true, 00:27:15.462 "reset": true, 00:27:15.462 "nvme_admin": false, 00:27:15.462 "nvme_io": false, 00:27:15.462 "nvme_io_md": false, 00:27:15.462 "write_zeroes": true, 00:27:15.462 "zcopy": true, 00:27:15.462 "get_zone_info": false, 00:27:15.462 "zone_management": false, 00:27:15.462 "zone_append": false, 00:27:15.462 "compare": false, 00:27:15.462 "compare_and_write": false, 00:27:15.462 "abort": true, 00:27:15.462 "seek_hole": false, 00:27:15.462 "seek_data": false, 00:27:15.462 "copy": true, 00:27:15.462 "nvme_iov_md": false 00:27:15.462 }, 00:27:15.462 "memory_domains": [ 00:27:15.462 { 00:27:15.462 "dma_device_id": "system", 00:27:15.462 "dma_device_type": 1 00:27:15.462 }, 00:27:15.462 { 00:27:15.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:27:15.462 "dma_device_type": 2 00:27:15.462 } 00:27:15.462 ], 00:27:15.462 "driver_specific": {} 00:27:15.462 } 00:27:15.462 ] 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd 
bdev_get_bdevs 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0477fee9-cb12-52ba-b258-1ec7ce0efeab"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0477fee9-cb12-52ba-b258-1ec7ce0efeab",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' 
"4719791a-ea95-5d37-babc-0b1878281014"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4719791a-ea95-5d37-babc-0b1878281014",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2dcc4c74-59e9-5687-998d-833fd7b6257c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "2dcc4c74-59e9-5687-998d-833fd7b6257c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' 
' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "a85915dd-efe6-58ba-8c17-2097467e24f8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a85915dd-efe6-58ba-8c17-2097467e24f8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:27:15.462 12:08:05 
blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:27:15.462 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 788796 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 788796 ']' 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 788796 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 788796 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 788796' 00:27:15.462 killing process with pid 788796 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 788796 00:27:15.462 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 788796 00:27:15.722 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:15.722 12:08:05 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:15.722 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:15.722 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:15.722 12:08:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:15.722 ************************************ 00:27:15.722 START TEST bdev_hello_world 00:27:15.722 
************************************ 00:27:15.722 12:08:05 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:15.722 [2024-07-12 12:08:05.948809] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:27:15.722 [2024-07-12 12:08:05.948848] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid789487 ] 00:27:15.980 [2024-07-12 12:08:06.010934] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:15.980 [2024-07-12 12:08:06.080456] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:15.980 [2024-07-12 12:08:06.101312] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:15.980 [2024-07-12 12:08:06.109337] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:15.980 [2024-07-12 12:08:06.117356] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:15.980 [2024-07-12 12:08:06.212587] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:18.516 [2024-07-12 12:08:08.352245] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:18.516 [2024-07-12 12:08:08.352318] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:18.516 [2024-07-12 12:08:08.352327] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:18.516 [2024-07-12 12:08:08.360263] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_xts" 00:27:18.516 [2024-07-12 12:08:08.360277] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:18.516 [2024-07-12 12:08:08.360283] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:18.516 [2024-07-12 12:08:08.368282] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:18.516 [2024-07-12 12:08:08.368293] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:18.516 [2024-07-12 12:08:08.368298] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:18.516 [2024-07-12 12:08:08.376306] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:18.516 [2024-07-12 12:08:08.376321] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:18.516 [2024-07-12 12:08:08.376327] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:18.516 [2024-07-12 12:08:08.443495] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:27:18.516 [2024-07-12 12:08:08.443539] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:27:18.516 [2024-07-12 12:08:08.443549] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:27:18.516 [2024-07-12 12:08:08.444393] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:27:18.516 [2024-07-12 12:08:08.444446] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:27:18.516 [2024-07-12 12:08:08.444456] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:27:18.516 [2024-07-12 12:08:08.444486] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:27:18.516 00:27:18.516 [2024-07-12 12:08:08.444497] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:27:18.516 00:27:18.516 real 0m2.831s 00:27:18.516 user 0m2.546s 00:27:18.516 sys 0m0.249s 00:27:18.516 12:08:08 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:18.516 12:08:08 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:27:18.516 ************************************ 00:27:18.516 END TEST bdev_hello_world 00:27:18.516 ************************************ 00:27:18.775 12:08:08 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:27:18.775 12:08:08 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:27:18.775 12:08:08 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:18.775 12:08:08 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:18.775 12:08:08 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:18.775 ************************************ 00:27:18.775 START TEST bdev_bounds 00:27:18.775 ************************************ 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=789964 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 789964' 00:27:18.775 Process bdevio pid: 789964 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@293 -- # waitforlisten 789964 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 789964 ']' 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:18.775 12:08:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:18.775 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:18.776 12:08:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:18.776 12:08:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:18.776 [2024-07-12 12:08:08.848416] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:27:18.776 [2024-07-12 12:08:08.848454] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid789964 ] 00:27:18.776 [2024-07-12 12:08:08.912869] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:18.776 [2024-07-12 12:08:08.993017] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:18.776 [2024-07-12 12:08:08.993115] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:18.776 [2024-07-12 12:08:08.993117] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:18.776 [2024-07-12 12:08:09.014008] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:19.073 [2024-07-12 12:08:09.022038] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:19.073 [2024-07-12 12:08:09.030059] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:19.073 [2024-07-12 12:08:09.124331] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:21.645 [2024-07-12 12:08:11.263083] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:21.645 [2024-07-12 12:08:11.263142] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:21.645 [2024-07-12 12:08:11.263151] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:21.645 [2024-07-12 12:08:11.271100] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:21.645 [2024-07-12 12:08:11.271112] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:21.645 [2024-07-12 12:08:11.271118] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:21.645 [2024-07-12 12:08:11.279120] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:21.645 [2024-07-12 12:08:11.279131] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:21.645 [2024-07-12 12:08:11.279137] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:21.645 [2024-07-12 12:08:11.287141] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:21.645 [2024-07-12 12:08:11.287154] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:21.645 [2024-07-12 12:08:11.287159] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:21.645 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:21.645 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:27:21.645 12:08:11 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:21.645 I/O targets: 00:27:21.645 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:27:21.645 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:27:21.645 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:27:21.645 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:27:21.645 00:27:21.645 00:27:21.645 CUnit - A unit testing framework for C - Version 2.1-3 00:27:21.645 http://cunit.sourceforge.net/ 00:27:21.645 00:27:21.645 00:27:21.645 Suite: bdevio tests on: crypto_ram3 00:27:21.645 Test: blockdev write read block ...passed 00:27:21.645 Test: blockdev write zeroes read block ...passed 00:27:21.645 Test: blockdev write zeroes read no split ...passed 00:27:21.645 Test: blockdev write zeroes read split ...passed 00:27:21.645 
Test: blockdev write zeroes read split partial ...passed 00:27:21.645 Test: blockdev reset ...passed 00:27:21.645 Test: blockdev write read 8 blocks ...passed 00:27:21.645 Test: blockdev write read size > 128k ...passed 00:27:21.645 Test: blockdev write read invalid size ...passed 00:27:21.645 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:21.645 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:21.645 Test: blockdev write read max offset ...passed 00:27:21.645 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:21.645 Test: blockdev writev readv 8 blocks ...passed 00:27:21.645 Test: blockdev writev readv 30 x 1block ...passed 00:27:21.645 Test: blockdev writev readv block ...passed 00:27:21.645 Test: blockdev writev readv size > 128k ...passed 00:27:21.645 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:21.645 Test: blockdev comparev and writev ...passed 00:27:21.645 Test: blockdev nvme passthru rw ...passed 00:27:21.645 Test: blockdev nvme passthru vendor specific ...passed 00:27:21.645 Test: blockdev nvme admin passthru ...passed 00:27:21.645 Test: blockdev copy ...passed 00:27:21.645 Suite: bdevio tests on: crypto_ram2 00:27:21.645 Test: blockdev write read block ...passed 00:27:21.645 Test: blockdev write zeroes read block ...passed 00:27:21.645 Test: blockdev write zeroes read no split ...passed 00:27:21.645 Test: blockdev write zeroes read split ...passed 00:27:21.645 Test: blockdev write zeroes read split partial ...passed 00:27:21.645 Test: blockdev reset ...passed 00:27:21.645 Test: blockdev write read 8 blocks ...passed 00:27:21.645 Test: blockdev write read size > 128k ...passed 00:27:21.645 Test: blockdev write read invalid size ...passed 00:27:21.645 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:21.645 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:21.645 Test: blockdev write read max 
offset ...passed 00:27:21.645 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:21.645 Test: blockdev writev readv 8 blocks ...passed 00:27:21.645 Test: blockdev writev readv 30 x 1block ...passed 00:27:21.645 Test: blockdev writev readv block ...passed 00:27:21.645 Test: blockdev writev readv size > 128k ...passed 00:27:21.646 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:21.646 Test: blockdev comparev and writev ...passed 00:27:21.646 Test: blockdev nvme passthru rw ...passed 00:27:21.646 Test: blockdev nvme passthru vendor specific ...passed 00:27:21.646 Test: blockdev nvme admin passthru ...passed 00:27:21.646 Test: blockdev copy ...passed 00:27:21.646 Suite: bdevio tests on: crypto_ram1 00:27:21.646 Test: blockdev write read block ...passed 00:27:21.646 Test: blockdev write zeroes read block ...passed 00:27:21.646 Test: blockdev write zeroes read no split ...passed 00:27:21.646 Test: blockdev write zeroes read split ...passed 00:27:21.646 Test: blockdev write zeroes read split partial ...passed 00:27:21.646 Test: blockdev reset ...passed 00:27:21.646 Test: blockdev write read 8 blocks ...passed 00:27:21.646 Test: blockdev write read size > 128k ...passed 00:27:21.646 Test: blockdev write read invalid size ...passed 00:27:21.646 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:21.646 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:21.646 Test: blockdev write read max offset ...passed 00:27:21.646 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:21.646 Test: blockdev writev readv 8 blocks ...passed 00:27:21.646 Test: blockdev writev readv 30 x 1block ...passed 00:27:21.646 Test: blockdev writev readv block ...passed 00:27:21.646 Test: blockdev writev readv size > 128k ...passed 00:27:21.646 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:21.646 Test: blockdev comparev and writev ...passed 00:27:21.646 
Test: blockdev nvme passthru rw ...passed 00:27:21.646 Test: blockdev nvme passthru vendor specific ...passed 00:27:21.646 Test: blockdev nvme admin passthru ...passed 00:27:21.646 Test: blockdev copy ...passed 00:27:21.646 Suite: bdevio tests on: crypto_ram 00:27:21.646 Test: blockdev write read block ...passed 00:27:21.646 Test: blockdev write zeroes read block ...passed 00:27:21.646 Test: blockdev write zeroes read no split ...passed 00:27:21.646 Test: blockdev write zeroes read split ...passed 00:27:21.646 Test: blockdev write zeroes read split partial ...passed 00:27:21.646 Test: blockdev reset ...passed 00:27:21.646 Test: blockdev write read 8 blocks ...passed 00:27:21.646 Test: blockdev write read size > 128k ...passed 00:27:21.646 Test: blockdev write read invalid size ...passed 00:27:21.646 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:21.646 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:21.646 Test: blockdev write read max offset ...passed 00:27:21.646 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:21.646 Test: blockdev writev readv 8 blocks ...passed 00:27:21.646 Test: blockdev writev readv 30 x 1block ...passed 00:27:21.646 Test: blockdev writev readv block ...passed 00:27:21.646 Test: blockdev writev readv size > 128k ...passed 00:27:21.646 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:21.646 Test: blockdev comparev and writev ...passed 00:27:21.646 Test: blockdev nvme passthru rw ...passed 00:27:21.646 Test: blockdev nvme passthru vendor specific ...passed 00:27:21.646 Test: blockdev nvme admin passthru ...passed 00:27:21.646 Test: blockdev copy ...passed 00:27:21.646 00:27:21.646 Run Summary: Type Total Ran Passed Failed Inactive 00:27:21.646 suites 4 4 n/a 0 0 00:27:21.646 tests 92 92 92 0 0 00:27:21.646 asserts 520 520 520 0 n/a 00:27:21.646 00:27:21.646 Elapsed time = 0.507 seconds 00:27:21.646 0 00:27:21.646 12:08:11 
blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 789964 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 789964 ']' 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 789964 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 789964 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 789964' 00:27:21.646 killing process with pid 789964 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 789964 00:27:21.646 12:08:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 789964 00:27:21.904 12:08:12 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:27:21.904 00:27:21.904 real 0m3.277s 00:27:21.904 user 0m9.263s 00:27:21.904 sys 0m0.395s 00:27:21.904 12:08:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:21.904 12:08:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:21.904 ************************************ 00:27:21.904 END TEST bdev_bounds 00:27:21.904 ************************************ 00:27:21.904 12:08:12 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:27:21.904 12:08:12 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:27:21.904 12:08:12 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:21.904 12:08:12 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:21.904 12:08:12 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:21.904 ************************************ 00:27:21.904 START TEST bdev_nbd 00:27:21.904 ************************************ 00:27:21.904 12:08:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:27:21.904 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:27:21.904 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:27:21.905 12:08:12 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=790456 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:21.905 12:08:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 790456 /var/tmp/spdk-nbd.sock 00:27:22.164 12:08:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 790456 ']' 00:27:22.164 12:08:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:27:22.164 12:08:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:22.164 12:08:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:27:22.164 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:27:22.164 12:08:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:22.164 12:08:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:22.164 [2024-07-12 12:08:12.197976] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:27:22.164 [2024-07-12 12:08:12.198014] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:22.164 [2024-07-12 12:08:12.261711] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:22.164 [2024-07-12 12:08:12.339590] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:22.164 [2024-07-12 12:08:12.360440] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:22.164 [2024-07-12 12:08:12.368461] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:22.164 [2024-07-12 12:08:12.376484] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:22.423 [2024-07-12 12:08:12.474286] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:24.960 [2024-07-12 12:08:14.613210] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:24.960 [2024-07-12 12:08:14.613255] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:24.960 [2024-07-12 12:08:14.613283] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:24.960 [2024-07-12 12:08:14.621230] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:24.960 [2024-07-12 12:08:14.621241] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: Malloc1 00:27:24.960 [2024-07-12 12:08:14.621246] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:24.960 [2024-07-12 12:08:14.629248] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:24.960 [2024-07-12 12:08:14.629258] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:24.960 [2024-07-12 12:08:14.629263] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:24.960 [2024-07-12 12:08:14.637270] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:24.960 [2024-07-12 12:08:14.637279] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:24.960 [2024-07-12 12:08:14.637284] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:24.960 12:08:14 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:24.960 1+0 records in 00:27:24.960 1+0 records out 00:27:24.960 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000209411 s, 19.6 MB/s 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:24.960 12:08:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:24.960 1+0 records in 00:27:24.960 1+0 records out 00:27:24.960 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230517 s, 17.8 MB/s 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:24.960 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:25.219 1+0 records in 00:27:25.219 1+0 records out 00:27:25.219 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237149 s, 17.3 MB/s 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:27:25.219 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:25.220 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:27:25.478 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:27:25.478 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:25.479 1+0 records in 00:27:25.479 1+0 records out 00:27:25.479 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218034 s, 18.8 MB/s 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:27:25.479 { 00:27:25.479 "nbd_device": "/dev/nbd0", 00:27:25.479 "bdev_name": "crypto_ram" 00:27:25.479 }, 00:27:25.479 { 00:27:25.479 "nbd_device": "/dev/nbd1", 00:27:25.479 "bdev_name": "crypto_ram1" 00:27:25.479 }, 00:27:25.479 { 00:27:25.479 "nbd_device": "/dev/nbd2", 00:27:25.479 "bdev_name": "crypto_ram2" 00:27:25.479 }, 00:27:25.479 { 00:27:25.479 "nbd_device": "/dev/nbd3", 00:27:25.479 "bdev_name": "crypto_ram3" 00:27:25.479 } 00:27:25.479 ]' 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:27:25.479 { 00:27:25.479 "nbd_device": "/dev/nbd0", 00:27:25.479 "bdev_name": "crypto_ram" 00:27:25.479 }, 00:27:25.479 { 00:27:25.479 "nbd_device": "/dev/nbd1", 00:27:25.479 "bdev_name": "crypto_ram1" 00:27:25.479 }, 00:27:25.479 { 00:27:25.479 "nbd_device": "/dev/nbd2", 00:27:25.479 "bdev_name": "crypto_ram2" 00:27:25.479 }, 00:27:25.479 { 00:27:25.479 "nbd_device": "/dev/nbd3", 00:27:25.479 "bdev_name": 
"crypto_ram3" 00:27:25.479 } 00:27:25.479 ]' 00:27:25.479 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:25.738 12:08:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:25.738 12:08:15 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:25.998 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:26.257 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:26.543 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:26.543 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:26.543 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:26.543 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:26.543 12:08:16 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:27:26.544 12:08:16 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:26.544 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:27:26.803 /dev/nbd0 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:26.803 1+0 records in 00:27:26.803 1+0 records out 00:27:26.803 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226837 s, 18.1 MB/s 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:26.803 12:08:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:27:26.803 /dev/nbd1 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:27.062 1+0 records in 00:27:27.062 1+0 records out 00:27:27.062 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225345 s, 18.2 MB/s 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:27:27.062 /dev/nbd10 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:27.062 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:27.063 1+0 records in 00:27:27.063 1+0 records out 00:27:27.063 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000154088 s, 26.6 MB/s 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:27.063 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:27:27.322 /dev/nbd11 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:27.322 1+0 records in 00:27:27.322 1+0 records out 00:27:27.322 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000198517 s, 20.6 MB/s 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:27.322 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:27:27.581 { 00:27:27.581 "nbd_device": "/dev/nbd0", 00:27:27.581 "bdev_name": "crypto_ram" 00:27:27.581 }, 00:27:27.581 { 00:27:27.581 "nbd_device": "/dev/nbd1", 00:27:27.581 "bdev_name": "crypto_ram1" 00:27:27.581 }, 00:27:27.581 { 00:27:27.581 "nbd_device": "/dev/nbd10", 00:27:27.581 "bdev_name": "crypto_ram2" 00:27:27.581 }, 00:27:27.581 { 00:27:27.581 "nbd_device": "/dev/nbd11", 00:27:27.581 "bdev_name": "crypto_ram3" 00:27:27.581 } 00:27:27.581 ]' 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:27:27.581 { 00:27:27.581 "nbd_device": "/dev/nbd0", 00:27:27.581 "bdev_name": "crypto_ram" 00:27:27.581 }, 00:27:27.581 { 00:27:27.581 "nbd_device": "/dev/nbd1", 00:27:27.581 "bdev_name": "crypto_ram1" 00:27:27.581 }, 00:27:27.581 { 00:27:27.581 "nbd_device": "/dev/nbd10", 00:27:27.581 "bdev_name": "crypto_ram2" 00:27:27.581 }, 00:27:27.581 { 00:27:27.581 "nbd_device": "/dev/nbd11", 00:27:27.581 "bdev_name": "crypto_ram3" 00:27:27.581 } 00:27:27.581 ]' 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:27.581 12:08:17 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:27:27.581 /dev/nbd1 00:27:27.581 /dev/nbd10 00:27:27.581 /dev/nbd11' 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:27:27.581 /dev/nbd1 00:27:27.581 /dev/nbd10 00:27:27.581 /dev/nbd11' 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:27:27.581 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:27:27.581 256+0 records in 00:27:27.581 256+0 records out 00:27:27.581 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102705 s, 102 MB/s 00:27:27.582 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:27.582 
12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:27:27.582 256+0 records in 00:27:27.582 256+0 records out 00:27:27.582 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0346172 s, 30.3 MB/s 00:27:27.582 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:27.582 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:27:27.582 256+0 records in 00:27:27.582 256+0 records out 00:27:27.582 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0298175 s, 35.2 MB/s 00:27:27.582 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:27.582 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:27:27.582 256+0 records in 00:27:27.582 256+0 records out 00:27:27.582 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0246809 s, 42.5 MB/s 00:27:27.582 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:27.582 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:27:27.840 256+0 records in 00:27:27.840 256+0 records out 00:27:27.840 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.023059 s, 45.5 MB/s 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:27.840 12:08:17 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:27.840 12:08:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:27.840 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:28.099 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:28.357 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:27:28.616 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:27:28.617 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:28.617 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:28.617 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:28.617 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:27:28.617 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:27:28.617 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:27:28.875 malloc_lvol_verify 00:27:28.875 12:08:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:27:29.134 c561c6cd-0481-404e-a706-d086f376a3b2 00:27:29.134 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:27:29.134 eebd03cc-8c8b-47ca-9795-4b51cbc98126 00:27:29.134 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:27:29.392 /dev/nbd0 
00:27:29.392 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:27:29.392 mke2fs 1.46.5 (30-Dec-2021) 00:27:29.392 Discarding device blocks: 0/4096 done 00:27:29.392 Creating filesystem with 4096 1k blocks and 1024 inodes 00:27:29.392 00:27:29.393 Allocating group tables: 0/1 done 00:27:29.393 Writing inode tables: 0/1 done 00:27:29.393 Creating journal (1024 blocks): done 00:27:29.393 Writing superblocks and filesystem accounting information: 0/1 done 00:27:29.393 00:27:29.393 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:27:29.393 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:27:29.393 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:29.393 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:29.393 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:29.393 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:29.393 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:29.393 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 790456 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 790456 ']' 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 790456 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 790456 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 790456' 00:27:29.651 killing process with pid 790456 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 790456 00:27:29.651 12:08:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 790456 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:27:29.911 00:27:29.911 real 0m7.873s 00:27:29.911 user 0m10.470s 00:27:29.911 sys 0m2.400s 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:29.911 12:08:20 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:29.911 ************************************ 00:27:29.911 END TEST bdev_nbd 00:27:29.911 ************************************ 00:27:29.911 12:08:20 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:27:29.911 12:08:20 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:27:29.911 12:08:20 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:27:29.911 12:08:20 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:27:29.911 12:08:20 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:27:29.911 12:08:20 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:29.911 12:08:20 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:29.911 12:08:20 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:29.911 ************************************ 00:27:29.911 START TEST bdev_fio 00:27:29.911 ************************************ 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:29.911 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:27:29.911 
12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:27:29.911 
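After `fio_config_gen` writes the `bdev.fio` template, the harness appends one `[job_<bdev>]` section per crypto bdev (the `echo '[job_crypto_ram]'` / `echo filename=crypto_ram` lines that follow). A sketch of that append step, with an illustrative helper name:

```shell
# Illustrative sketch of the per-bdev job-section generation done by
# bdev/blockdev.sh: one [job_<name>] stanza per bdev, appended to the
# fio config file.
append_fio_jobs() {
    local config_file=$1; shift
    local b
    for b in "$@"; do
        printf '[job_%s]\nfilename=%s\n' "$b" "$b" >> "$config_file"
    done
}
```

This is why the later fio output shows four jobs, `job_crypto_ram` through `job_crypto_ram3`, one per crypto bdev in `bdev.json`.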
12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:29.911 12:08:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:30.170 ************************************ 00:27:30.170 START TEST bdev_fio_rw_verify 00:27:30.170 ************************************ 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:27:30.170 12:08:20 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:30.170 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:30.171 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:30.171 12:08:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:30.429 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:30.429 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:30.429 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:30.429 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:30.429 fio-3.35 00:27:30.429 Starting 4 threads 00:27:45.311 00:27:45.311 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=792577: Fri Jul 12 12:08:33 2024 00:27:45.311 read: IOPS=25.4k, BW=99.3MiB/s (104MB/s)(993MiB/10001msec) 00:27:45.311 slat (usec): min=11, max=1302, avg=56.03, stdev=27.51 00:27:45.311 clat (usec): min=18, max=2094, avg=307.54, stdev=176.28 00:27:45.311 lat (usec): min=30, max=2142, avg=363.57, stdev=186.90 00:27:45.311 clat percentiles (usec): 00:27:45.311 | 50.000th=[ 269], 99.000th=[ 816], 99.900th=[ 1020], 99.990th=[ 1401], 00:27:45.311 | 99.999th=[ 1811] 00:27:45.311 write: IOPS=27.9k, BW=109MiB/s (114MB/s)(1062MiB/9738msec); 0 zone resets 00:27:45.311 slat (usec): min=17, max=256, avg=63.67, stdev=25.88 00:27:45.311 clat (usec): min=14, 
max=1323, avg=334.25, stdev=179.54 00:27:45.311 lat (usec): min=50, max=1421, avg=397.92, stdev=188.91 00:27:45.311 clat percentiles (usec): 00:27:45.311 | 50.000th=[ 306], 99.000th=[ 840], 99.900th=[ 1037], 99.990th=[ 1156], 00:27:45.311 | 99.999th=[ 1254] 00:27:45.311 bw ( KiB/s): min=88032, max=160501, per=97.81%, avg=109172.47, stdev=4421.82, samples=76 00:27:45.311 iops : min=22008, max=40125, avg=27293.11, stdev=1105.45, samples=76 00:27:45.311 lat (usec) : 20=0.01%, 50=0.07%, 100=6.09%, 250=35.80%, 500=41.28% 00:27:45.311 lat (usec) : 750=14.62%, 1000=2.00% 00:27:45.311 lat (msec) : 2=0.14%, 4=0.01% 00:27:45.311 cpu : usr=99.69%, sys=0.00%, ctx=74, majf=0, minf=292 00:27:45.311 IO depths : 1=4.6%, 2=27.3%, 4=54.5%, 8=13.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:45.311 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:45.311 complete : 0=0.0%, 4=88.0%, 8=12.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:45.311 issued rwts: total=254124,271744,0,0 short=0,0,0,0 dropped=0,0,0,0 00:27:45.311 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:45.311 00:27:45.311 Run status group 0 (all jobs): 00:27:45.311 READ: bw=99.3MiB/s (104MB/s), 99.3MiB/s-99.3MiB/s (104MB/s-104MB/s), io=993MiB (1041MB), run=10001-10001msec 00:27:45.311 WRITE: bw=109MiB/s (114MB/s), 109MiB/s-109MiB/s (114MB/s-114MB/s), io=1062MiB (1113MB), run=9738-9738msec 00:27:45.311 00:27:45.311 real 0m13.211s 00:27:45.311 user 0m48.590s 00:27:45.311 sys 0m0.365s 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:27:45.311 ************************************ 00:27:45.311 END TEST bdev_fio_rw_verify 00:27:45.311 ************************************ 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:27:45.311 12:08:33 
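Before launching fio, the harness above scanned `ldd` output of the `spdk_bdev` plugin for `libasan` and `libclang_rt.asan` so a sanitizer runtime could be prepended to `LD_PRELOAD`. A sketch of that extraction, as a helper (illustrative name) that reads ldd-style output on stdin and prints the resolved path of the first matching library:

```shell
# Sketch of the sanitizer-library detection seen in the log: the harness
# runs `ldd` on the fio plugin and keeps the third field (the resolved
# path) of any line naming a sanitizer runtime.
find_sanitizer_lib() {
    grep "$1" | awk '{print $3}' | head -n1
}
```

On a hit, the path is prepended to `LD_PRELOAD` alongside the plugin; in this run both greps came back empty (`asan_lib=`), so only `spdk_bdev` itself was preloaded.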
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:27:45.311 12:08:33 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0477fee9-cb12-52ba-b258-1ec7ce0efeab"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0477fee9-cb12-52ba-b258-1ec7ce0efeab",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "4719791a-ea95-5d37-babc-0b1878281014"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4719791a-ea95-5d37-babc-0b1878281014",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2dcc4c74-59e9-5687-998d-833fd7b6257c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "2dcc4c74-59e9-5687-998d-833fd7b6257c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "a85915dd-efe6-58ba-8c17-2097467e24f8"' ' ],' ' "product_name": 
"crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a85915dd-efe6-58ba-8c17-2097467e24f8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:27:45.311 crypto_ram1 00:27:45.311 crypto_ram2 00:27:45.311 crypto_ram3 ]] 00:27:45.311 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0477fee9-cb12-52ba-b258-1ec7ce0efeab"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0477fee9-cb12-52ba-b258-1ec7ce0efeab",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 
0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "4719791a-ea95-5d37-babc-0b1878281014"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4719791a-ea95-5d37-babc-0b1878281014",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2dcc4c74-59e9-5687-998d-833fd7b6257c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "2dcc4c74-59e9-5687-998d-833fd7b6257c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "a85915dd-efe6-58ba-8c17-2097467e24f8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a85915dd-efe6-58ba-8c17-2097467e24f8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": 
true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:27:45.312 12:08:33 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:45.312 ************************************ 00:27:45.312 START TEST bdev_fio_trim 00:27:45.312 ************************************ 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:45.312 12:08:33 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:45.312 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:45.312 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:45.312 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:45.312 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:45.312 fio-3.35 00:27:45.312 Starting 4 threads 00:27:57.518 00:27:57.518 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=794753: Fri Jul 12 12:08:46 2024 00:27:57.518 write: IOPS=46.4k, BW=181MiB/s (190MB/s)(1814MiB/10001msec); 0 zone 
resets 00:27:57.518 slat (usec): min=12, max=883, avg=49.12, stdev=21.10 00:27:57.518 clat (usec): min=24, max=1083, avg=177.80, stdev=87.77 00:27:57.518 lat (usec): min=40, max=1297, avg=226.92, stdev=97.15 00:27:57.518 clat percentiles (usec): 00:27:57.518 | 50.000th=[ 161], 99.000th=[ 392], 99.900th=[ 474], 99.990th=[ 668], 00:27:57.518 | 99.999th=[ 1057] 00:27:57.518 bw ( KiB/s): min=166496, max=280384, per=100.00%, avg=186639.16, stdev=11637.32, samples=76 00:27:57.518 iops : min=41624, max=70096, avg=46659.79, stdev=2909.33, samples=76 00:27:57.518 trim: IOPS=46.4k, BW=181MiB/s (190MB/s)(1814MiB/10001msec); 0 zone resets 00:27:57.518 slat (nsec): min=4250, max=84048, avg=15269.66, stdev=6866.45 00:27:57.518 clat (usec): min=26, max=1298, avg=227.08, stdev=97.16 00:27:57.518 lat (usec): min=30, max=1340, avg=242.35, stdev=99.63 00:27:57.518 clat percentiles (usec): 00:27:57.518 | 50.000th=[ 210], 99.000th=[ 461], 99.900th=[ 545], 99.990th=[ 824], 00:27:57.518 | 99.999th=[ 1270] 00:27:57.518 bw ( KiB/s): min=166496, max=280384, per=100.00%, avg=186638.32, stdev=11637.47, samples=76 00:27:57.518 iops : min=41624, max=70096, avg=46659.79, stdev=2909.34, samples=76 00:27:57.518 lat (usec) : 50=1.20%, 100=13.57%, 250=53.66%, 500=31.39%, 750=0.17% 00:27:57.518 lat (usec) : 1000=0.01% 00:27:57.518 lat (msec) : 2=0.01% 00:27:57.518 cpu : usr=99.69%, sys=0.00%, ctx=54, majf=0, minf=96 00:27:57.518 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:57.518 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:57.518 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:57.518 issued rwts: total=0,464267,464268,0 short=0,0,0,0 dropped=0,0,0,0 00:27:57.518 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:57.518 00:27:57.518 Run status group 0 (all jobs): 00:27:57.518 WRITE: bw=181MiB/s (190MB/s), 181MiB/s-181MiB/s (190MB/s-190MB/s), io=1814MiB (1902MB), run=10001-10001msec 
00:27:57.518 TRIM: bw=181MiB/s (190MB/s), 181MiB/s-181MiB/s (190MB/s-190MB/s), io=1814MiB (1902MB), run=10001-10001msec 00:27:57.518 00:27:57.518 real 0m13.226s 00:27:57.518 user 0m48.425s 00:27:57.518 sys 0m0.365s 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:27:57.518 ************************************ 00:27:57.518 END TEST bdev_fio_trim 00:27:57.518 ************************************ 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:27:57.518 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:27:57.518 00:27:57.518 real 0m26.726s 00:27:57.518 user 1m37.197s 00:27:57.518 sys 0m0.851s 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:57.518 ************************************ 00:27:57.518 END TEST bdev_fio 00:27:57.518 ************************************ 00:27:57.518 12:08:46 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:27:57.518 12:08:46 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:57.518 12:08:46 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
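The WRITE/TRIM bandwidth fio reports above follows directly from the issued I/O count, block size, and runtime in the summary (464267 I/Os of 4096 B over run=10001msec). A quick arithmetic cross-check, not part of the harness:

```shell
# Values copied from the fio trim summary above.
ios=464267 bs=4096 ms=10001
# Decimal MB/s (fio's parenthesized figure) and binary MiB/s (its primary figure).
bw_mb=$(awk -v n="$ios" -v b="$bs" -v t="$ms" 'BEGIN { printf "%.0f", n * b / (t / 1000) / 1e6 }')
bw_mib=$(awk -v n="$ios" -v b="$bs" -v t="$ms" 'BEGIN { printf "%.0f", n * b / (t / 1000) / 1048576 }')
echo "${bw_mb} MB/s (${bw_mib} MiB/s)"
```

This reproduces the reported 181MiB/s (190MB/s).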
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:57.518 12:08:46 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:27:57.518 12:08:46 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:57.518 12:08:46 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:57.518 ************************************ 00:27:57.518 START TEST bdev_verify 00:27:57.518 ************************************ 00:27:57.518 12:08:46 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:57.518 [2024-07-12 12:08:46.924733] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:27:57.518 [2024-07-12 12:08:46.924774] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid796525 ] 00:27:57.518 [2024-07-12 12:08:46.991116] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:57.518 [2024-07-12 12:08:47.066447] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:57.518 [2024-07-12 12:08:47.066449] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:57.518 [2024-07-12 12:08:47.087388] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:57.518 [2024-07-12 12:08:47.095412] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:57.518 [2024-07-12 12:08:47.103433] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:57.519 [2024-07-12 12:08:47.203979] 
accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:59.470 [2024-07-12 12:08:49.336388] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:59.470 [2024-07-12 12:08:49.336443] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:59.470 [2024-07-12 12:08:49.336452] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:59.470 [2024-07-12 12:08:49.344407] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:59.470 [2024-07-12 12:08:49.344419] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:59.470 [2024-07-12 12:08:49.344425] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:59.470 [2024-07-12 12:08:49.352431] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:59.470 [2024-07-12 12:08:49.352441] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:59.470 [2024-07-12 12:08:49.352446] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:59.470 [2024-07-12 12:08:49.360455] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:59.470 [2024-07-12 12:08:49.360465] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:59.470 [2024-07-12 12:08:49.360470] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:59.470 Running I/O for 5 seconds... 
00:28:04.771 00:28:04.771 Latency(us) 00:28:04.771 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:04.771 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:04.771 Verification LBA range: start 0x0 length 0x1000 00:28:04.771 crypto_ram : 5.06 707.17 2.76 0.00 0.00 180623.01 2793.08 121335.22 00:28:04.771 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:04.771 Verification LBA range: start 0x1000 length 0x1000 00:28:04.771 crypto_ram : 5.05 709.04 2.77 0.00 0.00 180220.29 3323.61 121335.22 00:28:04.771 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:04.771 Verification LBA range: start 0x0 length 0x1000 00:28:04.771 crypto_ram1 : 5.06 708.62 2.77 0.00 0.00 179967.23 2886.70 114344.72 00:28:04.771 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:04.771 Verification LBA range: start 0x1000 length 0x1000 00:28:04.771 crypto_ram1 : 5.06 708.94 2.77 0.00 0.00 179868.14 3666.90 113845.39 00:28:04.771 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:04.771 Verification LBA range: start 0x0 length 0x1000 00:28:04.771 crypto_ram2 : 5.03 5517.46 21.55 0.00 0.00 23039.04 5554.96 19848.05 00:28:04.771 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:04.771 Verification LBA range: start 0x1000 length 0x1000 00:28:04.771 crypto_ram2 : 5.03 5543.17 21.65 0.00 0.00 22931.23 5430.13 19848.05 00:28:04.771 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:04.771 Verification LBA range: start 0x0 length 0x1000 00:28:04.771 crypto_ram3 : 5.05 5529.07 21.60 0.00 0.00 22954.39 2668.25 19848.05 00:28:04.771 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:04.771 Verification LBA range: start 0x1000 length 0x1000 00:28:04.771 crypto_ram3 : 5.05 5554.66 21.70 0.00 0.00 22847.64 2761.87 
19848.05 00:28:04.771 =================================================================================================================== 00:28:04.771 Total : 24978.13 97.57 0.00 0.00 40829.71 2668.25 121335.22 00:28:04.771 00:28:04.771 real 0m7.929s 00:28:04.771 user 0m15.252s 00:28:04.771 sys 0m0.269s 00:28:04.771 12:08:54 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:04.771 12:08:54 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:28:04.771 ************************************ 00:28:04.771 END TEST bdev_verify 00:28:04.771 ************************************ 00:28:04.771 12:08:54 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:28:04.771 12:08:54 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:04.771 12:08:54 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:28:04.771 12:08:54 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:04.771 12:08:54 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:04.771 ************************************ 00:28:04.771 START TEST bdev_verify_big_io 00:28:04.771 ************************************ 00:28:04.771 12:08:54 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:04.771 [2024-07-12 12:08:54.921025] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
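The Total row in the bdev_verify latency summary above aggregates the eight per-job rows; summing the per-job IOPS figures copied from that table reproduces it exactly (a sanity check, not part of the harness):

```shell
# Per-job IOPS averages from the bdev_verify table (crypto_ram..crypto_ram3,
# core masks 0x1 and 0x2 each); their sum should match the Total row.
total=$(printf '%s\n' 707.17 709.04 708.62 708.94 5517.46 5543.17 5529.07 5554.66 \
  | awk '{ s += $1 } END { printf "%.2f", s }')
echo "$total"
```

This matches the reported Total of 24978.13 IOPS.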
00:28:04.771 [2024-07-12 12:08:54.921057] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid797755 ] 00:28:04.772 [2024-07-12 12:08:54.982735] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:05.030 [2024-07-12 12:08:55.054724] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:05.030 [2024-07-12 12:08:55.054726] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:05.030 [2024-07-12 12:08:55.075664] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:05.030 [2024-07-12 12:08:55.083688] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:05.030 [2024-07-12 12:08:55.091710] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:05.030 [2024-07-12 12:08:55.197283] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:28:07.564 [2024-07-12 12:08:57.330618] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:28:07.564 [2024-07-12 12:08:57.330677] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:07.564 [2024-07-12 12:08:57.330685] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:07.564 [2024-07-12 12:08:57.338636] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:28:07.564 [2024-07-12 12:08:57.338647] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:07.564 [2024-07-12 12:08:57.338653] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:07.564 [2024-07-12 
12:08:57.346656] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:28:07.564 [2024-07-12 12:08:57.346666] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:07.564 [2024-07-12 12:08:57.346671] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:07.564 [2024-07-12 12:08:57.354677] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:28:07.564 [2024-07-12 12:08:57.354686] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:07.564 [2024-07-12 12:08:57.354692] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:07.564 Running I/O for 5 seconds... 00:28:07.826 [2024-07-12 12:08:57.938564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.826 [2024-07-12 12:08:57.938843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.826 [2024-07-12 12:08:57.938897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.826 [2024-07-12 12:08:57.938938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.826 [2024-07-12 12:08:57.938976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.826 [2024-07-12 12:08:57.939010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.826 [2024-07-12 12:08:57.939348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.826 [2024-07-12 12:08:57.939357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.827 [2024-07-12 12:08:57.988096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.988428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.988458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.988485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.988511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.988857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.988867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.991133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.991162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.991187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.991212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.991534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.991564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.828 [2024-07-12 12:08:57.991589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.991614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.991933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.991943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.994155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.994184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.994227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.994256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.994607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.994637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.994665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.994690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.995020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.828 [2024-07-12 12:08:57.995029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.997279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.997310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.997334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.997359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.997717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.997745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.997771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.997797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.998114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:57.998124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.000358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.000387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.828 [2024-07-12 12:08:58.000412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.000437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.000761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.000789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.000814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.000838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.001160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.001169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.003472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.003502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.003549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.003576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.003943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.828 [2024-07-12 12:08:58.003971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.003998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.004022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.004336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.004345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.006636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.006667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.006708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.006733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.007090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.007118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.007143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.007169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.828 [2024-07-12 12:08:58.007492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.007501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.009670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.009699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.009725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.009752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.010095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.010124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.010149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.010175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.010470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.010480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.012684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.828 [2024-07-12 12:08:58.012713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.012738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.012767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.013124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.013163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.013189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.013214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.013503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.013512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.015859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.015908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.015939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.015967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.828 [2024-07-12 12:08:58.016322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.016350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.016388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.016437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.016737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.016747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.828 [2024-07-12 12:08:58.019000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.019030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.019072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.019099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.019404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.019433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.019458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.829 [2024-07-12 12:08:58.019496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.019856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.019866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.022877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.829 [2024-07-12 12:08:58.024935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.024963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.025004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.025029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.025377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.025405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.025432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.025457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.025758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.025768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.027445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.027473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.027498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.829 [2024-07-12 12:08:58.027526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.027738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.027764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.027789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.027813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.028071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.028080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.029450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.029479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.029504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.029534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.029899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.029927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.829 [2024-07-12 12:08:58.029956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.029981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.030261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.030270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.031969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.031997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.032022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.032046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.032249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.032275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.032299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.032323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.032583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.829 [2024-07-12 12:08:58.032593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.033844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.033873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.033898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.033931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.034281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.034310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.034335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.034361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.034683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.034693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.037350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:07.829 [2024-07-12 12:08:58.038492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:07.829 [2024-07-12 12:08:58.039155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:08.094 [2024-07-12 12:08:58.237802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[the same src_mbufs allocation error repeated continuously between 12:08:58.039155 and 12:08:58.237802; intermediate duplicate lines elided]
00:28:08.094 [2024-07-12 12:08:58.238652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.239474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.240409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.240654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.240664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.242043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.242308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.242572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.242831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.243327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.243604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.243860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.244117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.094 [2024-07-12 12:08:58.244437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.244445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.246405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.246678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.246944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.247203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.247783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.248041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.248300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.248584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.248895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.248905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.250977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.094 [2024-07-12 12:08:58.251246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.251502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.251531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.252138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.252401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.252689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.252956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.253268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.253277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.255289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.255552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.255813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.256074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.094 [2024-07-12 12:08:58.256105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.256386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.256659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.256917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.257173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.257432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.257703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.257713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.259465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.259494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.259524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.259550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.259881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.094 [2024-07-12 12:08:58.259923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.259948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.259985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.260026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.260329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.260339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.094 [2024-07-12 12:08:58.262583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.262904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.264824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.264864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.264889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.264926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.265230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.265271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.265299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.265338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.265386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.094 [2024-07-12 12:08:58.265684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.265694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.267545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.267584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.267610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.267634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.267920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.267961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.267986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.268011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.268036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.268333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.268342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.094 [2024-07-12 12:08:58.270132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.270161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.270204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.270229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.094 [2024-07-12 12:08:58.270495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.270544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.270572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.270602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.270627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.270967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.270976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.272646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.272700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.095 [2024-07-12 12:08:58.272726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.272752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.273069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.273102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.273128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.273154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.273181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.273501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.273510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.275194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.275223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.275261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.275285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.095 [2024-07-12 12:08:58.275648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.275682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.275708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.275733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.275759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.276064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.276073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.277894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.277923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.277966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.277992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.278335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.278369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.095 [2024-07-12 12:08:58.278395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.278420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.278445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.278810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.278819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.280437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.280465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.280490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.280521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.280827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.280864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.280889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.280913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.095 [2024-07-12 12:08:58.280938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.281267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.281276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.283876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.095 [2024-07-12 12:08:58.283886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.285570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.285598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.285643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.285683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.286038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.286072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.286098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.286124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.286150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.286472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.286481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.095 [2024-07-12 12:08:58.288259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.095 [2024-07-12 12:08:58.288287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:08.362 [last message repeated for subsequent allocation attempts through 2024-07-12 12:08:58.372968]
00:28:08.362 [2024-07-12 12:08:58.372977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.375511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.376579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.377274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.378094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.378278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.379272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.380060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.380322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.380598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.380959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.380968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.383242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.362 [2024-07-12 12:08:58.383753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.384742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.385830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.386007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.386989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.387255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.387513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.387773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.388104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.388114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.390232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.391010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.391820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.362 [2024-07-12 12:08:58.392804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.392987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.393743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.394014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.394271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.394532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.394855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.394867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.396367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.397350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.398408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.399392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.399574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.362 [2024-07-12 12:08:58.399844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.400101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.400356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.400635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.400864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.400873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.402616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.403427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.404411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.405418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.405713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.405997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.406253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.362 [2024-07-12 12:08:58.406510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.406872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.407051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.407060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.409036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.410060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.411038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.411906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.412198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.412461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.412742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.413006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.413916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.362 [2024-07-12 12:08:58.414133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.414142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.416021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.417014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.418014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.418409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.418766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.419031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.419288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.419676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.420489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.420689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.420698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.362 [2024-07-12 12:08:58.422880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.362 [2024-07-12 12:08:58.423886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.424716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.424979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.425301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.425568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.425830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.426812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.427719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.427896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.427905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.429990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.430982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.363 [2024-07-12 12:08:58.431247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.431505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.431778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.432042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.432603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.433415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.434396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.434578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.434587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.436679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.437345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.437610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.437867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.363 [2024-07-12 12:08:58.438217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.438480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.439551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.440604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.441691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.441874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.441883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.444092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.444365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.444630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.444900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.445226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.445878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.363 [2024-07-12 12:08:58.446702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.447698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.448689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.448908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.448917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.450622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.450896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.451160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.451424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.451761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.452870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.453877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.454944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.363 [2024-07-12 12:08:58.456054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.456305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.456314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.457590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.457856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.458119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.458383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.458608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.459431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.460431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.461425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.461992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.462175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.363 [2024-07-12 12:08:58.462184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.463563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.463833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.464096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.464485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.464672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.465673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.466726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.467824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.468444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.468661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.468670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.470093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.363 [2024-07-12 12:08:58.470360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.470626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.471507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.471741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.472754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.473724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.474322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.475406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.475589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.475597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.477142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.477405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.363 [2024-07-12 12:08:58.477820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.363 [2024-07-12 12:08:58.478633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:08.366 [... same *ERROR* line repeated continuously for every entry from 12:08:58.478633 through 12:08:58.591454 ...]
00:28:08.366 [2024-07-12 12:08:58.591454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:08.366 [2024-07-12 12:08:58.591481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.591506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.591546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.591887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.591922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.591948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.591974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.591999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.592299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.592311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.366 [2024-07-12 12:08:58.594092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.594749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.596437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.596465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.596490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.596515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.596846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.366 [2024-07-12 12:08:58.596879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.596915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.596940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.596975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.597153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.597161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.366 [2024-07-12 12:08:58.598694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.598724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.598768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.598796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.599089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.599121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.599146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.367 [2024-07-12 12:08:58.599174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.599200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.599542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.599551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.601232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.601260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.601301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.601339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.601666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.601711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.601738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.601764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.601790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.367 [2024-07-12 12:08:58.602115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.602125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.367 [2024-07-12 12:08:58.603902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.630 [2024-07-12 12:08:58.605072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.605973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.607200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.607227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.630 [2024-07-12 12:08:58.607254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.607284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.607457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.607490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.607514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.607549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.630 [2024-07-12 12:08:58.607574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.607814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.607822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.608819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.608847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.608873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.608897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.631 [2024-07-12 12:08:58.609146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.609189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.609214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.609239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.609265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.609594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.609604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.610977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.611003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.611030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.611054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.611229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.611266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.631 [2024-07-12 12:08:58.611290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.611314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.611337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.611510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.611520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.612653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.612686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.612713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.612738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.612913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.612950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.612975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.613000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.631 [2024-07-12 12:08:58.613026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.613306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.613315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.614975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.615002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.615045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.615076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.615260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.615293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.615319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.615343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.615377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.615557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.631 [2024-07-12 12:08:58.615564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.616726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.616758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.616783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.616807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.616987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.617030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.617056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.617081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.617105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.617307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.617315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.631 [2024-07-12 12:08:58.619095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.619622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.620752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.620796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.620821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.631 [2024-07-12 12:08:58.620846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.621023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.621064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.621091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.621115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.621142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.621318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.621327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.622902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.622931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.622960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.622986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.631 [2024-07-12 12:08:58.623331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.631 [2024-07-12 12:08:58.623363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.783819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (previous message repeated continuously between 12:08:58.623363 and 12:08:58.783819; duplicates elided)
00:28:08.634 [2024-07-12 12:08:58.784807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.785000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.785994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.786348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.786608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.786864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.787200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.787209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.789397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.790021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.790840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.791809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.791989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.634 [2024-07-12 12:08:58.792861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.793123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.793381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.793643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.793984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.793993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.796081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.797001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.797933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.798234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.798585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.798850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.799110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.634 [2024-07-12 12:08:58.799587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.800397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.800579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.800588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.634 [2024-07-12 12:08:58.802843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.803863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.804803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.805064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.805416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.805687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.805945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.806205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.806468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.635 [2024-07-12 12:08:58.806802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.806811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.808801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.809072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.809330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.809606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.809946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.810212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.810476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.810750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.811007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.811351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.811361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.635 [2024-07-12 12:08:58.813245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.813512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.813773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.814039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.814321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.814595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.814851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.815107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.815367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.815678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.815687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.817747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.818013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.635 [2024-07-12 12:08:58.818272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.818531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.818841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.819103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.819362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.819627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.819890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.820217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.820228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.822093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.822378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.822645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.822910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.635 [2024-07-12 12:08:58.823175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.823452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.823734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.823991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.824251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.824593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.824603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.826605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.826871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.827137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.827400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.827760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.828025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.635 [2024-07-12 12:08:58.828288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.828550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.828816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.829108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.829116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.831265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.831539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.831798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.832056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.832416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.832686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.832952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.833222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.635 [2024-07-12 12:08:58.833478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.833771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.833780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.835694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.835957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.835987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.836242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.836501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.836772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.837030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.837285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.635 [2024-07-12 12:08:58.837541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.837861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.636 [2024-07-12 12:08:58.837877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.839718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.839981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.840240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.840270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.840533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.840803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.841060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.841314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.841580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.841863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.841872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.843569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.636 [2024-07-12 12:08:58.843597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.843623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.843649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.843938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.843989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.844027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.844062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.844087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.844337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.844345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.636 [2024-07-12 12:08:58.846219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.846906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.848938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.848976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.849002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.849036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.849310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.636 [2024-07-12 12:08:58.849350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.849387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.849426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.849457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.849732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.849741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.851575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.851612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.851637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.851662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.851913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.851952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.636 [2024-07-12 12:08:58.851978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.636 [2024-07-12 12:08:58.852002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[… same error repeated continuously from 12:08:58.852002 through 12:08:58.906241 …]
00:28:08.902 [2024-07-12 12:08:58.906241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:08.902 [2024-07-12 12:08:58.906281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.906320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.906347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.906371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.906727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.906736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.902 [2024-07-12 12:08:58.908609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.908820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.910040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.910068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.911044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.911073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.911361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.911406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.911442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.911466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.911490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.902 [2024-07-12 12:08:58.911829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.911839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.913350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.913378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.913402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.914382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.914564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.914604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.914628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.914652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.914676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.914930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.914940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.902 [2024-07-12 12:08:58.916447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.916737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.917004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.917274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.917592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.918628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.919717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.920742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.921705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.921975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.921984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.923361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.923647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.902 [2024-07-12 12:08:58.923923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.924183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.924366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.925183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.926175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.927131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.927664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.927842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.927850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.929225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.929488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.902 [2024-07-12 12:08:58.929755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.930137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.903 [2024-07-12 12:08:58.930313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.931285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.932331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.933412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.934043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.934249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.934257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.935721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.935987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.936248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.937198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.937404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.938392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.903 [2024-07-12 12:08:58.939366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.939856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.940815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.941012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.941021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.942695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.942966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.943471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.944302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.944484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.945543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.946527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.947301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.903 [2024-07-12 12:08:58.948126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.948308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.948317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.950006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.950277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.951301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.952217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.952398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.953401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.953889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.954866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.955948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.956134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.903 [2024-07-12 12:08:58.956143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.957975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.958516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.959344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.960330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.960511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.961491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.962272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.963100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.964094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.964277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.964286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.966245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.903 [2024-07-12 12:08:58.967318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.968394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.969495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.969681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.970146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.970977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.971965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.972962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.973144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.973153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.975539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.976357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.977328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.903 [2024-07-12 12:08:58.978324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.978584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.979561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.980446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.981409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.982434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.982732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.982741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.985423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.986388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.987356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.988166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.988381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.903 [2024-07-12 12:08:58.989202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.990175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.991140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.991715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.992034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.992043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.994433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.995410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.996390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.996860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.997047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.998147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:58.999159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.903 [2024-07-12 12:08:59.000096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:59.000357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:59.000681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:59.000691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:59.003115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:59.004093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.903 [2024-07-12 12:08:59.004653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.904 [2024-07-12 12:08:59.005680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.904 [2024-07-12 12:08:59.005861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.904 [2024-07-12 12:08:59.006846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.904 [2024-07-12 12:08:59.007821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.904 [2024-07-12 12:08:59.008116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.904 [2024-07-12 12:08:59.008377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.904 [2024-07-12 12:08:59.008678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.906 [last message repeated ~270 times, through 2024-07-12 12:08:59.133969]
00:28:08.906 [2024-07-12 12:08:59.133994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.134019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.134364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.134373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.135919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.135967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.135994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.136020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.136278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.136323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.136351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.136376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.136401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.907 [2024-07-12 12:08:59.136612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.136621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.138871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:08.907 [2024-07-12 12:08:59.140500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.140533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.140565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.140604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.140786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.140817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.140843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.140875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.140911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.141267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:08.907 [2024-07-12 12:08:59.141277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.142793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.142850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.169 [2024-07-12 12:08:59.142877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.142903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.143253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.143289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.143316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.143342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.143368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.143633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.143641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.169 [2024-07-12 12:08:59.145500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.145830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.147307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.147335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.147360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.147383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.147702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.147743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.169 [2024-07-12 12:08:59.147769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.147794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.147819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.148137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.148146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.149897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.149947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.149986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.150012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.150352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.150387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.150413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.150439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.169 [2024-07-12 12:08:59.150465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.150760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.150780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.152546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.152574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.152598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.152622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.152925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.152958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.152984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.153009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.153037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.153235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.169 [2024-07-12 12:08:59.153244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.154980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.156413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.169 [2024-07-12 12:08:59.156444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.156470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.156495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.156738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.156778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.169 [2024-07-12 12:08:59.156803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.156828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.156853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.157176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.157186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.170 [2024-07-12 12:08:59.158393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.158937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.161720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.161752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.161793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.161818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.162134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.170 [2024-07-12 12:08:59.162168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.162195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.162221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.162246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.162432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.162440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.164942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.164974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.165032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.165061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.165242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.165279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.165307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.170 [2024-07-12 12:08:59.165332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.165356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.165538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.165547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.167605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.167637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.167677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.167708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.167886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.167923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.167948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.167972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.167997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.170 [2024-07-12 12:08:59.168174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.168181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.170939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.170971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.171015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.171042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.171349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.171382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.171409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.171435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.171460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.171757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.170 [2024-07-12 12:08:59.171766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.170 [2024-07-12 12:08:59.174171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.173 [duplicate log lines condensed: the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeated continuously from 12:08:59.174171 through 12:08:59.394110; repeats omitted]
00:28:09.173 [2024-07-12 12:08:59.394401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.173 [2024-07-12 12:08:59.394410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.173 [2024-07-12 12:08:59.396404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.396694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.396969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.397002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.397565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.397839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.398094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.398353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.398636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.398646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.400646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.174 [2024-07-12 12:08:59.400696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.400963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.400996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.401573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.401603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.401871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.401898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.402221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.402232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.404109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.404142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.404399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.404436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.174 [2024-07-12 12:08:59.404968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.405011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.405275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.405302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.405636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.405650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.408125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.408162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.408448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.408485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.409099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.409131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.409394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.174 [2024-07-12 12:08:59.409421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.409723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.409733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.411714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.411748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.412033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.174 [2024-07-12 12:08:59.412064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.412664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.412701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.412965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.412996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.413335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.413346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.436 [2024-07-12 12:08:59.415350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.415384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.415650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.415677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.416272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.416302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.416566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.416604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.416870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.416880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.418903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.418947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.419212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.436 [2024-07-12 12:08:59.419250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.419842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.419874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.420128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.420153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.420484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.420493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.422008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.422040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.422304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.422336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.422916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.422947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.436 [2024-07-12 12:08:59.423213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.423245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.423522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.436 [2024-07-12 12:08:59.423532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.425739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.425781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.425806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.425828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.426356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.426386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.426412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.426690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.427023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.437 [2024-07-12 12:08:59.427035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.428780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.429888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.429926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.437 [2024-07-12 12:08:59.429953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.429981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.430326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.430353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.430394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.430420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.430716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.430726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.432208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.432238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.432264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.432289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.432500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.437 [2024-07-12 12:08:59.432531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.432556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.432581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.432824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.432834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.433885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.433913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.433938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.433965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.434254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.434297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.434324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.434349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.437 [2024-07-12 12:08:59.434699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.434709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.436841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.438016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.437 [2024-07-12 12:08:59.438044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.438070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.438094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.438302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.438330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.438355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.438379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.438675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.438685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.440610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.440639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.440663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.440687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.437 [2024-07-12 12:08:59.440963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.440997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.441022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.441046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.441228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.441237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.442345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.442374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.442399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.442426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.442644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.442672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.437 [2024-07-12 12:08:59.442697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.437 [2024-07-12 12:08:59.442721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:09.440 [2024-07-12 12:08:59.674047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:09.440 [identical "Failed to get src_mbufs!" error repeated continuously from 12:08:59.442721 through 12:08:59.674047; intermediate duplicates omitted]
00:28:09.440 [2024-07-12 12:08:59.674079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.674343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.674651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.674660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.676641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.676678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.676947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.676975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.677343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.677612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.677641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.677902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.440 [2024-07-12 12:08:59.678182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.440 [2024-07-12 12:08:59.678191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.703 [2024-07-12 12:08:59.680149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.680188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.681021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.681054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.681368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.682284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.682327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.682592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.682909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.682921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.684838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.684871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.704 [2024-07-12 12:08:59.685149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.685177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.685502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.685771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.685803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.686066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.686318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.686328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.688250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.688284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.688570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.688603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.688813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.704 [2024-07-12 12:08:59.689153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.689181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.689443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.689633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.689643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.692086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.692120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.693109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.693140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.693537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.694364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.694393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.695380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.704 [2024-07-12 12:08:59.695564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.695573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.697343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.697374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.698271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.698300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.698557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.699550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.699581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.700554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.700928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.700937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.702393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.704 [2024-07-12 12:08:59.702425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.703313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.703340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.703712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.704193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.704222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.704874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.705205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.705215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.707325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.707358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.708218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.708248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.704 [2024-07-12 12:08:59.708491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.709486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.709516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.710503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.710755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.710764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.713951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.713984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.714032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.714052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.714252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.714297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.715291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.704 [2024-07-12 12:08:59.715325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.715508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.715522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.716607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.716636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.716661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.716685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.716937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.716964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.716995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.717022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.717199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.717207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.704 [2024-07-12 12:08:59.718797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.718827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.718872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.718899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.719250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.719277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.719302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.719327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.719546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.704 [2024-07-12 12:08:59.719555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.720668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.720696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.720720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.705 [2024-07-12 12:08:59.720745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.720978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.721004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.721029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.721053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.721232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.721240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.722732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.722760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.722813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.722839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.723221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.723249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.705 [2024-07-12 12:08:59.723275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.723299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.723606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.723615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.724668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.724704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.724731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.724758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.725006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.725033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.725058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.725083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.725315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.705 [2024-07-12 12:08:59.725323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.726398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.726426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.726451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.726475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.726824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.726851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.726876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.726902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.727229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.727237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.728644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.728674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.705 [2024-07-12 12:08:59.728713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.728971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.729002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.729026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.729200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.761206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.763504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.764303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.764331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.766927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.767558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.767599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.705 [2024-07-12 12:08:59.767624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.705 [2024-07-12 12:08:59.770320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:09.708 [... previous message repeated continuously from 12:08:59.770353 through 12:08:59.924051 ...]
00:28:09.708 [2024-07-12 12:08:59.924328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.924595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.924966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.924975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.928183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.929207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.929465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.929732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.930046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.930087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.930349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.930611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.930867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.708 [2024-07-12 12:08:59.931196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.931206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.932642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.932906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.933497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.933530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.933775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.933810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.934067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.934326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.934356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.934602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.934611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.708 [2024-07-12 12:08:59.937445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.937481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.937814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.938072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.938250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.938672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.938931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.939201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.939238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.939488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.939497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.941699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.941741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.708 [2024-07-12 12:08:59.942008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.942992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.943365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.943642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.944677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.944941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.944972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.945273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.708 [2024-07-12 12:08:59.945285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.971 [2024-07-12 12:08:59.947869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.971 [2024-07-12 12:08:59.947941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.971 [2024-07-12 12:08:59.948498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.971 [2024-07-12 12:08:59.949158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.971 [2024-07-12 12:08:59.949482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.949967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.950688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.950944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.950974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.951241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.951250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.952960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.953219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.953479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.953510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.953766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.954474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.972 [2024-07-12 12:08:59.954990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.955253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.955282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.955467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.955476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.958118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.958404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.958450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.958719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.958979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.959936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.960199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.960458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.972 [2024-07-12 12:08:59.960489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.960670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.960679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.962652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.962911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.962940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.963195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.963486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.963757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.964041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.964071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.964912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.965251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.972 [2024-07-12 12:08:59.965260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.967540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.967806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.968064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.968090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.968421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.968691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.968724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.968982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.969013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.969197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.969205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.970918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.972 [2024-07-12 12:08:59.970948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.971223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.971253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.971514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.971793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.972051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.972080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.972336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.972659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.972669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.974765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.974801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.975061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.972 [2024-07-12 12:08:59.975106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.975385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.975426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.975695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.975727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.975982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.976257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.976266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.978927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.978959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.979239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.979266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.979531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.972 [2024-07-12 12:08:59.979566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.980336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.980362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.980619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.980893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.980901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.983692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.983729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.983991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.984022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.984284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.984324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.985208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.972 [2024-07-12 12:08:59.985237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.985516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.985849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.985859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.987592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.987627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.988613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.988649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.972 [2024-07-12 12:08:59.988991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.989026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.989284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.989313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.990278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.973 [2024-07-12 12:08:59.990624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.990634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.994540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.994576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.994833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.994859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.995118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.995152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.995956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.995985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.996957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.997135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.973 [2024-07-12 12:08:59.997143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.973 [2024-07-12 12:08:59.999227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.976 [2024-07-12 12:09:00.097200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (message repeated for each allocation attempt between 12:08:59.999227 and 12:09:00.097200) 
00:28:09.976 [2024-07-12 12:09:00.097439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.097448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.100633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.100671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.100697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.101641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.101828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.101867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.102861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.102891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.103300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.103485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.103494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.976 [2024-07-12 12:09:00.105331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.106349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.106380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.107366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.107553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.107595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.107623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.108492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.108524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.108745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.108753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.112397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.113522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.976 [2024-07-12 12:09:00.113558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.113824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.114132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.115119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.115159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.116148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.116180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.116363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.116372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.119493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.120355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.120385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.120655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.976 [2024-07-12 12:09:00.120985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.121879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.121909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.122177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.122205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.122535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.122545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.125798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.126814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.126846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.127830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.128026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.128729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.976 [2024-07-12 12:09:00.128760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.129165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.129193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.129535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.129545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.132947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.133938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.133968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.134426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.134619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.135731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.135770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.136756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.976 [2024-07-12 12:09:00.136785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.136974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.136983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.140062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.140946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.140977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.141835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.976 [2024-07-12 12:09:00.142025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.143027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.143058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.143504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.143536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.143719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.977 [2024-07-12 12:09:00.143728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.146598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.147147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.147176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.147747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.148072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.148706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.148737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.149555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.149584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.149768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.149777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.152748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.977 [2024-07-12 12:09:00.153163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.153193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.154066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.154413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.154730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.154761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.155560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.155588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.155925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.155935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.158774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.159605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.159635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.977 [2024-07-12 12:09:00.159660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.159845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.160800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.160831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.161183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.161211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.161395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.161404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.164262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.165395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.165430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.166358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.166597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.977 [2024-07-12 12:09:00.167429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.168411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.168441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.169388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.169617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.169627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.174037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.174916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.175884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.176866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.177142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.178171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.179278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.977 [2024-07-12 12:09:00.180332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.181325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.181611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.181621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.185638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.186634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.187624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.188373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.188571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.189401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.190397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.191378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.191890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.977 [2024-07-12 12:09:00.192078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.192087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.194827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.195826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.196812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.197280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.197469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.198596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.199636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.200601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.201290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.201529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.201542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:09.977 [2024-07-12 12:09:00.204250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.205230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.205862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.206970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.207174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.208168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.209156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.209561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.210536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.210875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.210884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.213632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:09.977 [2024-07-12 12:09:00.213910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.240 [2024-07-12 12:09:00.214891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.243 [2024-07-12 12:09:00.362904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: last message repeated 272 times 
00:28:10.243 [2024-07-12 12:09:00.366814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.366860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.367127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.367156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.367446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.367482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.368324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.368354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.368619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.368867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.368877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.372597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.372637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.243 [2024-07-12 12:09:00.373613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.374060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.374249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.374289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.374579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.374615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.374975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.375174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.375184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.377874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.377917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.378721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.378751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.243 [2024-07-12 12:09:00.378997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.379036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.379062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.380056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.380087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.380272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.380281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.382444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.382479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.382506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.382536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.382809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.382845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.243 [2024-07-12 12:09:00.382871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.382897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.382922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.383135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.383144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.243 [2024-07-12 12:09:00.386531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.386845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.390663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.243 [2024-07-12 12:09:00.390672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.393799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.393831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.393863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.393889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.394157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.394198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.394223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.394248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.394273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.394553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.394562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.396297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.243 [2024-07-12 12:09:00.396329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.396354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.396382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.396569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.396608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.396633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.396658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.243 [2024-07-12 12:09:00.396690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.396873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.396881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.399575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.399608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.399632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.244 [2024-07-12 12:09:00.399657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.399948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.399986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.400014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.400040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.400070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.400396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.400405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.244 [2024-07-12 12:09:00.404337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.404654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.244 [2024-07-12 12:09:00.407752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.407965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.410364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.411273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.411304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.411337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.411535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.411572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.411599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.412575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.412603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.244 [2024-07-12 12:09:00.412783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.412792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.415799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.415836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.244 [2024-07-12 12:09:00.415861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.416658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.416840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.416879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.417877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.417909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.417935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.418199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.418209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.245 [2024-07-12 12:09:00.421507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.421543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.421834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.421863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.422082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.422813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.422842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.422867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.422893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.423212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.423222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.426378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.427468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.245 [2024-07-12 12:09:00.427505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.427533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.427737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.428831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.428861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.428888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.428912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.429099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.429108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.432321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.432357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.432399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.245 [2024-07-12 12:09:00.432425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.245 [2024-07-12 12:09:00.432614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.433601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.433636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.433661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.433688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.434032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.434040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.437121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.437165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.437191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.437218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.437405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.437902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.437931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.437958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.437985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.438317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.438327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.442632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.442671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.442699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.443695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.443887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.444547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.444579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.444608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.444956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.445280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.445290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.448707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.449714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.449745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.245 [2024-07-12 12:09:00.449777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.450044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.450079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.450104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.450129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.450942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.451126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.451134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.453891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.454250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.454279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.454305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.454626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.454658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.454686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.454711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.455607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.455798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.455807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.458833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.459628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.459660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.459686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.460015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.460048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.460075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.460100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.460365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.460730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.460740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.464139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.464181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.464207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.465184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.465370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.465409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.465435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.465460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.466064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.466416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.466425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.469290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.469322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.470456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.470489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.470759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.470795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.470820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.470844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.471676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.471861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.471869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.474869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.474902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.475950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.475980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.476160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.476194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.476220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.477278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.477308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.477488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.477499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.480773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.480809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.480861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.481126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.481448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.481481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.246 [2024-07-12 12:09:00.482557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.247 [2024-07-12 12:09:00.482585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.483663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.483852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.483862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.487162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.487458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.487485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.487746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.488062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.488094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.488120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.488376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.488416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.488604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.488613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.491007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.492120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.492153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.493049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.493312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.493598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.493628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.493892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.493922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.494257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.494267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.497254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.510 [2024-07-12 12:09:00.498146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.498178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.499230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.499413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.499799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.499829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.500085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.500113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.500380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.500388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.502782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.503586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.503616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.504426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.504611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.505620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.505651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.506092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.506120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.506468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.506478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.509221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.510329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.510367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.511138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.511352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.512351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.512381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.513349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.513378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.513662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.513673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.515635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.516608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.516640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.517728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.517977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.518825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.518855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.519819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.519848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.520026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.520035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.522710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.523548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.523579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.524548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.524735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.525289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.525320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.525746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.525773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.525998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.526007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.528687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.528975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.529003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.530131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.530319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.531324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.531355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.532345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.532374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.532683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.532692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.536032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.536319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.536347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.536374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.536716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.537471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.537500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.538322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.538351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.538537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.538546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.542051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.542331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.542361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.542629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.542962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.543395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.544219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.544250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.545173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.545359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.545367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.548583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.548856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.549886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.550791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.550977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.551986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.552454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.553428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.554510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.554700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.554710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.511 [2024-07-12 12:09:00.558638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.559634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.560624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.561450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.561676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.562521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.563512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.564505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.565115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.565410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.565419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.568977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.569578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.570644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.571705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.571898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.572896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.573191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.573449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.573710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.574027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.574037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.577568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.578585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.579588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.580313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.580598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.580873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.581137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.581400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.581672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.581915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.581924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.584356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.584626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.584886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.585145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.585468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.585749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.586010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.586285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.586553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.586860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.586870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.589446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.589721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.512 [2024-07-12 12:09:00.589983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.590242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.590586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.590858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.591138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.591407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.591675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.591958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.591967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.593922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.594208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.594472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.594743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.512 [2024-07-12 12:09:00.595022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.595297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.595566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.595830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.596099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.596357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.596366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.598283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.598555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.598825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.599091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.599412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.599687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.512 [2024-07-12 12:09:00.599951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.600217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.512 [2024-07-12 12:09:00.600489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.600798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.600808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.602827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.602876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.603139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.603404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.603712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.603981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.604248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.604280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.513 [2024-07-12 12:09:00.604552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.604890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.604900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.606700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.606974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.607239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.607269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.607597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.607868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.607897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.608161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.608430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.608712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.513 [2024-07-12 12:09:00.608722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.610907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.611197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.611243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.611511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.611856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.611902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.612165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.612430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.612707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.612993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.613002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.615157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.513 [2024-07-12 12:09:00.615190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.615480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.615751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.616061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.616098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.616354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.616621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.616886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.617151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.617160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.619028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.619317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.619594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.513 [2024-07-12 12:09:00.619862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.620159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.620194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.620457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.620742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.621004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.621299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.621308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.623090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.623357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.623637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.623904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.624191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.513 [2024-07-12 12:09:00.624227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.624491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.513 [2024-07-12 12:09:00.624764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.625035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.625328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.625341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.627070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.627336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.627615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.627646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.627972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.628006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.628270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.514 [2024-07-12 12:09:00.628540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.628568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.628843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.628852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.630759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.630791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.631061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.631324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.631690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.631964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.632237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.632501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.632536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.514 [2024-07-12 12:09:00.632881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.632890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.634821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.634859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.635963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.636236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.636422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.636707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.636979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.637237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.637269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.637595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.637605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.514 [2024-07-12 12:09:00.639505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.639540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.639821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.640087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.640366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.640647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.640917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.641173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.641203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.641535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.641547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.643173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.643441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.514 [2024-07-12 12:09:00.643709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.643740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.644070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.644353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.644618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.644881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.644911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.645193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.645202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.647512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.648325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.648354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.649395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.514 [2024-07-12 12:09:00.649608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.650619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.514 [2024-07-12 12:09:00.651612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.652033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.652061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.652395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.652406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.654951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.656087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.656119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.657014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.657230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.658042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.515 [2024-07-12 12:09:00.659044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.659073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.660073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.660294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.660303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.662276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.663359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.664420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.664459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.664650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.665656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.665693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.515 [2024-07-12 12:09:00.666324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.519 [2024-07-12 12:09:00.748687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.749598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.749627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.749654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.749679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.749865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.749873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.751264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.751295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.751325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.751590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.751943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.752212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.519 [2024-07-12 12:09:00.752242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.519 [2024-07-12 12:09:00.752270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.753088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.753275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.753284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.754451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.755443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.755474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.755503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.755691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.755730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.755756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.755781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.783 [2024-07-12 12:09:00.756277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.756638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.756648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.758251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.759099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.759130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.759157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.759335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.759372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.759397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.759427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.760427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.760710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.783 [2024-07-12 12:09:00.760719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.761764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.762043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.762071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.762098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.762427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.762459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.762485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.762511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.783 [2024-07-12 12:09:00.762788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.763108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.763118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.764712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.784 [2024-07-12 12:09:00.764761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.764788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.765622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.765815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.765851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.765888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.765915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.766890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.767087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.767096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.768877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.768907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.769740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.784 [2024-07-12 12:09:00.769770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.769988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.770026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.770052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.770078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.771068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.771247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.771256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.772357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.772384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.773427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.773464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.773733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.784 [2024-07-12 12:09:00.773766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.773803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.774061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.774088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.774371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.774379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.776646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.776678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.776719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.777364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.777553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.777590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.778404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.784 [2024-07-12 12:09:00.778432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.779435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.779622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.779632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.781380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.781720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.781750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.782568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.782747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.782786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.782815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.783787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.783816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.784 [2024-07-12 12:09:00.783995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.784004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.785115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.785887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.785916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.786185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.786532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.786803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.786833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.787099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.787128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.787310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.787319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.784 [2024-07-12 12:09:00.788454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.789296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.789327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.790314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.790495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.791515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.791547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.791804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.791831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.792150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.792160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.793608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.794602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.784 [2024-07-12 12:09:00.794633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.795623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.795875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.796955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.796991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.798103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.798135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.798317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.798326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.799863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.800129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.800157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.800890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.784 [2024-07-12 12:09:00.801117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.802131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.802162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.784 [2024-07-12 12:09:00.803154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.803184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.803399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.803408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.804460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.804976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.805007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.805271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.805585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.805857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.785 [2024-07-12 12:09:00.805885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.806151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.806181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.806362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.806371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.807584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.808590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.808620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.809459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.809754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.810025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.810054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.785 [2024-07-12 12:09:00.810318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.785 [2024-07-12 12:09:00.810346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:10.788 [2024-07-12 12:09:00.931629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.932607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.932637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.932919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.932930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.934564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.934625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.934899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.934926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.935216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.935481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.935861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.935890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.788 [2024-07-12 12:09:00.936697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.936880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.936890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.939023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.939056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.940040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.940071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.940303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.940345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.940610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.940643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.940899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.941233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.788 [2024-07-12 12:09:00.941242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.943534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.943569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.944078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.944107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.944285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.944325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.945349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.945386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.946362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.946562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.946575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.948794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.788 [2024-07-12 12:09:00.948827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.949630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.949659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.949839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.949877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.950869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.950900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.951513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.951760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.951773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.953309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.953345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.953604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.788 [2024-07-12 12:09:00.953654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.954003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.954040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.954297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.954327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.955281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.955468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.955476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.957604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.957646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.959010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.959039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.959222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.788 [2024-07-12 12:09:00.959263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.959535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.959563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.959828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.960134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.960143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.962508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.962545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.963242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.963271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.963455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.788 [2024-07-12 12:09:00.963490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.964312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.789 [2024-07-12 12:09:00.964341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.965326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.965514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.965527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.967437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.967469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.968526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.968557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.968743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.968777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.969823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.969860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.970858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.789 [2024-07-12 12:09:00.971161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.971170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.972436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.972469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.972738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.973002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.973340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.973374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.974167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.974197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.975013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.975198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.975208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.789 [2024-07-12 12:09:00.976323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.976352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.977329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.977360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.977547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.977586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.977611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.977877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.977909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.978248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.978258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.979798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.979829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.789 [2024-07-12 12:09:00.979856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.979881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.980065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.980109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.980135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.980159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.980184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.980363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.980371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.981532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.981560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.981585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.981613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.789 [2024-07-12 12:09:00.981796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.981834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.981859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.981884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.981909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.982170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.982179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.789 [2024-07-12 12:09:00.984529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.984770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.985925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.985953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.985978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.986003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.986185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.986223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.986249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.986274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.789 [2024-07-12 12:09:00.986299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.986480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.986489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:10.789 [2024-07-12 12:09:00.988806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:10.789 [2024-07-12 12:09:00.988815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.056 [2024-07-12 12:09:01.083928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.084229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.084238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.085279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.085562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.085592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.085617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.085950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.086214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.086241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.086497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.086530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.086742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.056 [2024-07-12 12:09:01.086750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.088485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.089326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.089359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.090358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.090545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.091267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.091535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.091567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.091822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.092088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.092098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.093979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.056 [2024-07-12 12:09:01.094239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.094496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.094768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.095035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.095306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.095570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.095833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.096091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.096368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.096377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.098342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.098625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.098902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.056 [2024-07-12 12:09:01.099164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.099483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.099756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.100016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.100280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.100549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.100903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.100916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.102875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.103155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.056 [2024-07-12 12:09:01.103415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.103683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.104000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.057 [2024-07-12 12:09:01.104267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.104537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.104802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.105057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.105384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.105394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.107324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.107594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.107854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.108122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.108409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.108685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.108943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.057 [2024-07-12 12:09:01.109200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.109460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.109801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.109812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.111734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.111999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.112261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.112531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.112857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.113119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.113376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.113643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.113907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.057 [2024-07-12 12:09:01.114245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.114254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.116224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.116490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.116756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.117014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.117338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.117610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.117874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.118140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.118398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.118764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.118775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.057 [2024-07-12 12:09:01.120663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.120924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.121183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.121446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.121701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.121974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.122233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.122493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.122781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.123064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.123073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.125016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.125298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.057 [2024-07-12 12:09:01.125574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.125844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.126184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.126458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.126728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.126995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.127264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.127627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.127638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.129546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.129583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.129856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.130114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.057 [2024-07-12 12:09:01.130446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.130722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.130983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.131014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.131273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.131635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.131645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.133421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.133717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.133988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.134019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.134333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.134611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.057 [2024-07-12 12:09:01.134645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.134909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.135177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.135438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.135447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.137626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.137904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.137946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.138213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.138541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.138578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.138855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.139117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.057 [2024-07-12 12:09:01.139385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.139671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.139681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.142017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.142063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.142331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.142606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.057 [2024-07-12 12:09:01.142940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.142978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.143249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.143515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.144574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.144916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.058 [2024-07-12 12:09:01.144925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.146596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.146865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.147131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.147406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.147705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.147761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.148035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.148295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.148557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.148865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.148874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.058 [2024-07-12 12:09:01.150668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.058 [2024-07-12 12:09:01.150950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.151218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.151497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.151803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.151847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.152103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.152361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.152631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.152926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.152936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.154790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.155083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.155352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.155384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.155698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.155748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.156019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.156284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.156314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.156635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.156646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.159115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.159170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.160197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.160928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.161150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.162134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.163137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.163890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.163923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.164230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.164239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.166694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.166728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.167728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.168711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.168955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.169917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.170780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.171769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.171801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.171988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.171997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.174010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.174045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.174846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.175844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.176031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.177126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.177795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.178602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.178632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.178813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.178822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.180280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.180556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.180834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.180864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.181044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.181870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.182834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.183828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.183859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.184144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.184152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.185473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.185743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.185773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.186036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.186363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.186846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.187701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.188688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.188721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.188914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.188924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.191023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.191769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.191799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.192076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.058 [2024-07-12 12:09:01.192397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.192676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.192945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.192976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.194056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.194265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.194275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.195428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.196529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.197615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.197645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.197840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.198108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.198139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.198393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.198422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.198667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.198676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.201068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.201124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.201981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.202011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.202193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.203003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.203992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.204023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.205010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.205271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.205281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.208224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.208266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.209288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.209325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.209509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.209549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.210564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.210600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.211247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.211468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.211477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.212876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.212909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.213175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.213206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.213541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.213575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.214273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.214302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.215138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.215327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.215335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.217486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.217525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.218514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.218549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.218904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.218946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.219211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.219239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.219503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.219848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.219858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.221983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.222035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.222648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.222678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.222906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.222945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.223935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.223967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.224966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.225221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.225231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.227872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.227910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.228768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.228798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.228976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.229015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.229986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.230020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.059 [2024-07-12 12:09:01.230526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.230742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.230751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.232141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.232172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.232453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.232483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.232818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.232854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.233470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.233498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.234320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.234502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.234511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.236601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.236635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.237622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.237651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.237966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.238017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.238281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.238310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.238573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.238923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.238934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.241059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.241109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.241586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.242413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.242601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.242648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.243628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.243658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.244377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.244668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.244678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.246419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.246447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.247256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.247285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.247463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.247501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.247530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.248496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.248529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.248893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.248902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.249951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.249979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.250004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.250036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.250314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.250354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.250380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.250404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.250429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.250741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.250752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.252873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.254684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.256998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.060 [2024-07-12 12:09:01.258660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.258669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.260423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.260452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.260477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.260502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.260773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.260807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.260833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.260857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.260883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.261098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.061 [2024-07-12 12:09:01.261107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.262841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.264356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.061 [2024-07-12 12:09:01.264386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.264410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.264435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.264775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.264812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.264839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.264866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.264892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.265133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.265142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.266184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.266969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.266999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.061 [2024-07-12 12:09:01.267024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.267250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.267287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.267313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.268310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.268340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.268525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.268534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.270466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.270502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.270531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.271345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.271529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.061 [2024-07-12 12:09:01.271563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.272659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.272689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.272714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.272897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.272905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.274056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.274084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.274840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.274870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.275165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.275434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.275463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.061 [2024-07-12 12:09:01.275489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.275515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.275830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.275838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.277159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.278144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.278176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.278201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.278483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.279485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.279523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.279550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.279575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.061 [2024-07-12 12:09:01.279756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.279764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.281364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.281399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.281425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.281453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.281771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.282321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.282349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.282374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.282399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.282631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.282639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.061 [2024-07-12 12:09:01.284597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.284631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.284657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.284681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.284867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.285874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.285905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.285937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.285964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.061 [2024-07-12 12:09:01.286253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.286261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.289096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.289131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.062 [2024-07-12 12:09:01.289162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.290147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.290333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.291460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.291492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.291526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.292382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.292631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.292640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.293951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.294224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.294264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.294300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.062 [2024-07-12 12:09:01.294666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.294700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.294727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.294752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.295140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.295323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.295332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.062 [2024-07-12 12:09:01.296466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.297285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.297317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.297343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.297530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.297571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.324 [2024-07-12 12:09:01.297597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.297622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.298608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.298837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.298847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.300659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.301522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.301552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.301577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.301800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.301839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.301864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.301889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.324 [2024-07-12 12:09:01.302873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.303057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.303069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.305176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.305209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.305234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.305503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.305834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.305871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.305898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.305923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.306188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.306525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.324 [2024-07-12 12:09:01.306535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.307732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.307760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.308587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.308618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.308802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.308837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.308863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.308888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.309780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.309965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.309974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.311537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.324 [2024-07-12 12:09:01.311567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.311845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.311878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.312202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.312234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.312276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.313280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.313316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.313499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.313507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.315677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.315710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.324 [2024-07-12 12:09:01.315735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.324 [2024-07-12 12:09:01.316722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.327 [... previous message repeated continuously (timestamps 12:09:01.316722 through 12:09:01.434456); identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 omitted ...]
00:28:11.327 [2024-07-12 12:09:01.435230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.435491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.435525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.435803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.435847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.436109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.436369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.436400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.436709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.436721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.438670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.438721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.438989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.327 [2024-07-12 12:09:01.439248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.439570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.440221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.441041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.442041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.442073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.442261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.442270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.444392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.444426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.444690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.444948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.445243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.327 [2024-07-12 12:09:01.445514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.445877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.446750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.446793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.446976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.446985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.449063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.449096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.450094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.450650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.327 [2024-07-12 12:09:01.450989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.451258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.451521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.328 [2024-07-12 12:09:01.451781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.451810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.451990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.451999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.453141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.454020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.455010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.455039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.455216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.456277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.456540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.456799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.456829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.328 [2024-07-12 12:09:01.457092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.457101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.459485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.460204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.460238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.461316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.461499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.462520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.463511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.463864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.463895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.464228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.464238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.328 [2024-07-12 12:09:01.466750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.467850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.467882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.468821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.469050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.469876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.470867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.470897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.471882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.472094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.472103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.474052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.475019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.328 [2024-07-12 12:09:01.475877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.475907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.476091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.477101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.477132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.477569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.477598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.477792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.477801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.479324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.479360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.479630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.479664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.328 [2024-07-12 12:09:01.479982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.480719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.481547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.481578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.482566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.482751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.482760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.484928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.484962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.485232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.485263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.485598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.485632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.328 [2024-07-12 12:09:01.485898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.485928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.486191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.486449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.486458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.488225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.488259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.489096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.489125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.489307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.489347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.490330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.328 [2024-07-12 12:09:01.490361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.329 [2024-07-12 12:09:01.490717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.491077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.491086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.493635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.493671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.494740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.494771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.494951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.494990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.495612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.495641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.496453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.496645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.329 [2024-07-12 12:09:01.496654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.498371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.498407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.498673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.498702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.498885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.498919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.499777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.499807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.500827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.501010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.501018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.503343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.329 [2024-07-12 12:09:01.503376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.503665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.503698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.504028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.504061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.504330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.504361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.504627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.504824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.504833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.507020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.507076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.508168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.329 [2024-07-12 12:09:01.508204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.508387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.508419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.509509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.509550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.509832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.510161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.510171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.512650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.512684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.513667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.513695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.329 [2024-07-12 12:09:01.513918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.329 [2024-07-12 12:09:01.513957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
(previous message repeated for all intervening timestamps) 
00:28:11.633 [2024-07-12 12:09:01.615942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.633 [2024-07-12 12:09:01.616198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.616522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.616576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.616603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.616865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.616892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.617183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.617192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.618280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.619078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.619110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.619898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.620084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.633 [2024-07-12 12:09:01.621099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.621130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.621582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.621612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.621798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.621807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.623562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.624020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.624049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.624904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.625090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.626095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.626126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.633 [2024-07-12 12:09:01.626901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.626931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.627117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.627126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.628354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.633 [2024-07-12 12:09:01.628638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.628667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.628928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.629272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.629624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.629655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.630472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.630499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.634 [2024-07-12 12:09:01.630702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.630711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.631936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.632941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.632971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.633955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.634207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.634478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.634507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.634790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.634818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.635146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.635156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.634 [2024-07-12 12:09:01.636962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.637223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.637249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.637504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.637819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.638100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.638154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.638411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.638439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.638755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.638765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.640477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.640762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.634 [2024-07-12 12:09:01.640792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.641059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.641371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.641674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.641703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.641961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.641992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.642258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.642270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.644064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.644334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.644365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.644636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.634 [2024-07-12 12:09:01.644956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.645227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.645258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.645523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.645552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.645894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.645903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.647887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.648153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.648181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.648207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.648504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.648788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.634 [2024-07-12 12:09:01.648827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.649110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.649139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.649462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.649472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.651428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.651714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.651745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.652006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.652343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.652621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.652895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.652930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.634 [2024-07-12 12:09:01.653194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.653525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.653535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.655598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.655866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.656133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.656398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.656726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.657011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.657278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.657545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.657807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.658122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.634 [2024-07-12 12:09:01.658131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.660152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.660438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.660727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.660997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.661338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.661612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.661876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.662141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.662408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.662732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.662742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.664755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.634 [2024-07-12 12:09:01.665030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.634 [2024-07-12 12:09:01.665295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.665561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.665894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.666260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.666534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.666799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.667060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.667411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.667420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.669488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.669761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.670028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.635 [2024-07-12 12:09:01.670295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.670583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.670855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.671118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.671382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.671652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.671935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.671944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.674080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.674345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.674624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.674887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.675215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.635 [2024-07-12 12:09:01.675484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.675754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.676028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.676300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.676630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.676640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.678796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.679077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.679341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.679616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.679916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.680196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.635 [2024-07-12 12:09:01.680460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.635 [2024-07-12 12:09:01.680726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.635 [ ... same *ERROR* message repeated continuously for timestamps 2024-07-12 12:09:01.680994 through 12:09:01.859194 ... ]
00:28:11.903 [2024-07-12 12:09:01.859476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.903 [2024-07-12 12:09:01.859483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.903 [2024-07-12 12:09:01.861303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.903 [2024-07-12 12:09:01.861612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.903 [2024-07-12 12:09:01.862628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.862908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.863094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.863509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.863772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.864028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.864285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.864586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.864596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.864602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.904 [2024-07-12 12:09:01.866093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.866374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.866641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.866910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.867200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.868267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.868586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.869368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.869641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.869972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.869982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.869990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.871814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.904 [2024-07-12 12:09:01.872646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.872905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.873164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.873423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.873778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.873790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.874422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.875331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.875926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.876750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.877097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.877450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.877460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.904 [2024-07-12 12:09:01.877467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.877477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.879621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.879656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.880566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.880603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.880788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.880797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.881065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.881095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.881352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.881379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.881688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.904 [2024-07-12 12:09:01.881698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.881706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.881714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.883440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.883473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.884332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.884364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.884612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.884621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.884888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.884917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.885195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.885236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.904 [2024-07-12 12:09:01.885599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.885610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.885617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.885625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.887822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.887856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.888827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.888859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.889222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.889231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.889502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.889536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.889800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.904 [2024-07-12 12:09:01.889826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.890136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.890146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.890155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.890163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.892302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.892351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.893127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.893157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.893457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.893466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.893747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.893776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.904 [2024-07-12 12:09:01.894040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.894067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.894398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.894407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.894415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.894424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.896533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.896568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.896904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.896932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.904 [2024-07-12 12:09:01.897279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.897288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.897561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.905 [2024-07-12 12:09:01.897589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.897852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.897880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.898130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.898139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.898146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.898153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.900299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.900349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.900615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.900643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.900973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.900983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.905 [2024-07-12 12:09:01.901250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.901279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.901543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.901578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.901765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.901773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.901780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.901787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.903385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.903416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.903700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.903727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.904020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.905 [2024-07-12 12:09:01.904029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.904296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.904324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.904958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.904987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.905219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.905229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.905236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.905242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.906667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.906699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.906741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.906770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.905 [2024-07-12 12:09:01.907093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.907102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.907371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.907400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.907427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.907452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.907793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.907807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.907814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.907820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.905 [2024-07-12 12:09:01.909127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.909767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.911434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.905 [2024-07-12 12:09:01.911460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.905 [2024-07-12 12:09:01.911500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.908 [2024-07-12 12:09:01.952258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.952618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.952628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.952635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.952644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.953846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.953875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.953899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.953930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.954213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.954222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.954269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.954294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.908 [2024-07-12 12:09:01.954318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.954345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.954531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.954540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.954546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.954553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.908 [2024-07-12 12:09:01.956618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.956887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.957993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.958022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.958052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.908 [2024-07-12 12:09:01.958076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.958256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.958265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.909 [2024-07-12 12:09:01.958312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.958342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.958366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.958391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.958572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.958581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.958591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.958598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.960357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.960979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.961031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.961950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.962171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.909 [2024-07-12 12:09:01.962180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.962241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.963247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.963303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.964319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.964507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.964520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.964527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.964534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.966284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.966674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.966725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.967479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.909 [2024-07-12 12:09:01.967686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.967696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.967760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.968734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.968789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.969593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.969808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.969817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.969824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.969832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.971267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.971511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.971579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.909 [2024-07-12 12:09:01.971821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.972008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.972017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.972074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.972994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.973047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.973987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.974266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.974275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.974281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.974288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.975445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.975713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.909 [2024-07-12 12:09:01.975763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.976006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.976319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.976328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.976382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.977057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.977108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.977872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.978052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.978061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.978068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.978074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.981023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.909 [2024-07-12 12:09:01.981292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.981352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.981613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.981926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.981935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.981989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.982875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.982937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.983951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.984274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.984283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.984290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.984297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.909 [2024-07-12 12:09:01.985482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.985750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.985777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.986034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.986358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.986368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.986419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.987147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.987197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.987988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.988172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.988180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.988188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.909 [2024-07-12 12:09:01.988195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.991075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.991344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.991372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.991649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.909 [2024-07-12 12:09:01.991953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.991966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.992002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.992839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.992869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.993812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.993991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.994000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.910 [2024-07-12 12:09:01.994007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.994014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.995139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.995406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.995687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.995951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.996283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.996293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.996325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.997228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.998064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.998970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.999249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.910 [2024-07-12 12:09:01.999258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.999265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:01.999273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.002461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.002875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.003685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.004690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.004875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.004884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.005370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.006101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.007071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.910 [2024-07-12 12:09:02.007974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.910 [2024-07-12 12:09:02.008245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:11.913 [2024-07-12 12:09:02.136413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.136757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.136767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.137035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.137311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.137579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.138528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.138716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.138726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.138732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.138739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.140996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.141263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:11.913 [2024-07-12 12:09:02.141687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.142483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.142846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:11.913 [2024-07-12 12:09:02.142856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.143263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.144068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.145076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.145472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.145663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.145672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.145679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.145686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.149264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.177 [2024-07-12 12:09:02.150093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.150696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.151586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.151785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.151792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.152526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.153019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.153288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.154198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.154508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.154520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.154527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.154534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.177 [2024-07-12 12:09:02.156499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.157397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.158027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.158303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.158633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.158642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.158910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.159173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.160216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.161183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.161551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.161561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.161568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.177 [2024-07-12 12:09:02.161576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.166004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.166043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.166305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.177 [2024-07-12 12:09:02.166332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.166595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.166604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.167360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.167391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.168239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.168272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.168521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.168530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.178 [2024-07-12 12:09:02.168537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.168544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.170169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.170202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.170465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.170501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.170693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.170702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.171439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.171469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.171846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.171880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.172066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.178 [2024-07-12 12:09:02.172075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.172082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.172089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.174866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.174904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.175508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.175541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.175774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.175784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.176773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.176806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.177343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.177373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.178 [2024-07-12 12:09:02.177564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.177577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.177584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.177590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.179485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.179516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.180349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.180380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.180568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.180577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.181093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.181131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.182135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.178 [2024-07-12 12:09:02.182172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.182359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.182368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.182374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.182381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.185214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.185251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.186072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.186101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.186285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.186294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.187273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.187304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.178 [2024-07-12 12:09:02.188081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.188111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.188295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.188303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.188310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.188323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.189723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.189754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.190033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.190060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.190418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.190427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.190740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.178 [2024-07-12 12:09:02.190770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.191583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.191611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.191795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.191804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.191811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.191818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.195587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.195643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.196541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.196571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.196911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.196920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.178 [2024-07-12 12:09:02.197194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.197222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.198255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.198291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.198659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.198669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.198677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.198685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.200820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.200852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.200896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.178 [2024-07-12 12:09:02.200929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.201194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.179 [2024-07-12 12:09:02.201202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.202071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.202102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.202127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.202152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.202336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.202344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.202351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.202357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.205233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.205273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.205317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.179 [2024-07-12 12:09:02.205342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.179 [2024-07-12 12:09:02.205528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.182 [previous message repeated for timestamps 12:09:02.205537 through 12:09:02.257473] 
00:28:12.182 [2024-07-12 12:09:02.257482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.257488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.257495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.260690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.260724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.260766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.260797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.261131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.261140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.261171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.261198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.261223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.261250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.182 [2024-07-12 12:09:02.261580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.261589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.261595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.261602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.262772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.262801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.262826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.262850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.263081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.263090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.263125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.263151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.263175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.182 [2024-07-12 12:09:02.263200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.263382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.263390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.263396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.263402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.265576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.265611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.265653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.265679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.265985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.265994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.266027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.266052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.182 [2024-07-12 12:09:02.266076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.266101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.266316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.266324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.266331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.266338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.267482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.267511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.267545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.267570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.267753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.267761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.267796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.182 [2024-07-12 12:09:02.267822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.267847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.267871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.268121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.268130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.268137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.268144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.270325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.271441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.271481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.272231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.272477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.272486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.182 [2024-07-12 12:09:02.272531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.273498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.273532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.274145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.274333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.274342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.274348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.274355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.275870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.276139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.276167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.277057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.277249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.182 [2024-07-12 12:09:02.277261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.277300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.277716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.277748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.278607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.278793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.278803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.278810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.278817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.281955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.282782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.282814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.283703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.182 [2024-07-12 12:09:02.283913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.283923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.182 [2024-07-12 12:09:02.283963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.284880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.284910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.285789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.285978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.285987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.285995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.286001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.288084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.288513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.288548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.183 [2024-07-12 12:09:02.289333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.289526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.289536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.289577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.290566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.290597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.291405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.291621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.291631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.291638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.291644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.294438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.295240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.183 [2024-07-12 12:09:02.295274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.296241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.296515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.296530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.296581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.297583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.297619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.298730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.299010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.299019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.299026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.299033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.302652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.183 [2024-07-12 12:09:02.303455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.303487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.304389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.304664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.304674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.304710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.305507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.305542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.306428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.306645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.306654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.306661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.306669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.183 [2024-07-12 12:09:02.308660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.309031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.309064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.310101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.310295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.310305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.310343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.310807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.310839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.311490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.311819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.311829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.311837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.183 [2024-07-12 12:09:02.311845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.314577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.315140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.316045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.316713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.317081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.317093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.317132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.317397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.317668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.318130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.318329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.183 [2024-07-12 12:09:02.318338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.183 [2024-07-12 12:09:02.318348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [... previous message repeated continuously from 2024-07-12 12:09:02.318348 through 2024-07-12 12:09:02.463526 ...]
00:28:12.450 [2024-07-12 12:09:02.463535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.464461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.465434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.465713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.465745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.466076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.466086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.466094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.466102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.468247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.468984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.469888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.470413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.450 [2024-07-12 12:09:02.470794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.470804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.450 [2024-07-12 12:09:02.471077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.472118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.472389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.472695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.472882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.472891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.472898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.472905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.475506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.476016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.476757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.451 [2024-07-12 12:09:02.477022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.477278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.477288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.478022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.478961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.479482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.480672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.480866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.480876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.480882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.480890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.484311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.485012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.451 [2024-07-12 12:09:02.485582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.486500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.486695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.486704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.487289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.487572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.487856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.488805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.489158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.489167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.489175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.489183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.492379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.451 [2024-07-12 12:09:02.493498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.494553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.494822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.495181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.495190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.495775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.496420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.496689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.497514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.497740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.497749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.497756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.497766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.451 [2024-07-12 12:09:02.500472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.500508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.501485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.501514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.501800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.501810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.502097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.502128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.502522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.502553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.502739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.502748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.502755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.451 [2024-07-12 12:09:02.502762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.506015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.506057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.506819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.506849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.507104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.507113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.508096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.508127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.509093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.509124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.509434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.509444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.451 [2024-07-12 12:09:02.509452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.509459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.513575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.513625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.514623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.514654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.514841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.514849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.515955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.515986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.516698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.516733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.516953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.451 [2024-07-12 12:09:02.516962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.516969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.516976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.520022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.520060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.520340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.520368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.520715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.520725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.521809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.521838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.451 [2024-07-12 12:09:02.522958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.522992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.452 [2024-07-12 12:09:02.523181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.523189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.523196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.523202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.526168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.526229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.526499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.526534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.526874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.526884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.527697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.527729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.528030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.452 [2024-07-12 12:09:02.528061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.528399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.528409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.528417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.528424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.531606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.531650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.532741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.532786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.532973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.532982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.533998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.534035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.452 [2024-07-12 12:09:02.534302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.534331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.534664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.534674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.534682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.534690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.537052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.537106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.538071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.538101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.538419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.538429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.539483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.452 [2024-07-12 12:09:02.539525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.540550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.540586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.540772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.540782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.540788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.540795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.542538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.542576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.542602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.542628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.542838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.452 [2024-07-12 12:09:02.542847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.452 [2024-07-12 12:09:02.543658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:12.455 [2024-07-12 12:09:02.592028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.593387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.594149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.594183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.594208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.594396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.594405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.594441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.594466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.594495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.594983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.595191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.595201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.455 [2024-07-12 12:09:02.595208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.595215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.596914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.597251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.455 [2024-07-12 12:09:02.597260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.597267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.597275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.455 [2024-07-12 12:09:02.598765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.598785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.599972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.600016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.600047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.600073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.600434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.600443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.600474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.600503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.600537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.455 [2024-07-12 12:09:02.600563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.455 [2024-07-12 12:09:02.600850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.600858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.600865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.600872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.601938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.601969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.601995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.456 [2024-07-12 12:09:02.602323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.602657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.603999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.604758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.604791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.605544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.605783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.605792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.605831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.456 [2024-07-12 12:09:02.606571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.606604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.607568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.607806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.607816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.607823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.607831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.609436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.610460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.610499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.611407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.611687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.611697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.456 [2024-07-12 12:09:02.611746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.612747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.612783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.613836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.614185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.614194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.614201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.614209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.615629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.616474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.616509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.617461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.617788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.456 [2024-07-12 12:09:02.617798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.617839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.618759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.618792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.619649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.619907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.619917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.619927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.619935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.621199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.621924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.621958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.622759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.456 [2024-07-12 12:09:02.623043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.623053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.623092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.623980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.624012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.624721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.624982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.624992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.624999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.625006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.626238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.626974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.627008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.456 [2024-07-12 12:09:02.627654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.627905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.627914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.627953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.628850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.628881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.629423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.629767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.629777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.629784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.629793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.630946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.631690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.456 [2024-07-12 12:09:02.631724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.632145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.632359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.632368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.632406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.633304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.633337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.633668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.634028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.634038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.634046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.456 [2024-07-12 12:09:02.634053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.635204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.457 [2024-07-12 12:09:02.636135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.636168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.636434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.636626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.636636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.636672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.637721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.637758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.638023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.638363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.638372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.638380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:12.457 [2024-07-12 12:09:02.638387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:12.457 [2024-07-12 12:09:02.639627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:12.718 [2024-07-12 12:09:02.716894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:12.718 [2024-07-12 12:09:02.720002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:12.718 [2024-07-12 12:09:02.720718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:12.718 [2024-07-12 12:09:02.720906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:12.718 [2024-07-12 12:09:02.720915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:12.974
00:28:12.974 Latency(us)
00:28:12.975 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:28:12.975 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:28:12.975 Verification LBA range: start 0x0 length 0x100
00:28:12.975 crypto_ram : 5.53 59.28 3.71 0.00 0.00 2059698.67 45188.63 1669732.45
00:28:12.975 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:28:12.975 Verification LBA range: start 0x100 length 0x100
00:28:12.975 crypto_ram : 5.47 55.90 3.49 0.00 0.00 2183694.50 15416.56 1741634.80
00:28:12.975 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:28:12.975 Verification LBA range: start 0x0 length 0x100
00:28:12.975 crypto_ram1 : 5.56 63.90 3.99 0.00 0.00 1897164.43 39945.75 1533916.89
00:28:12.975 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:28:12.975 Verification LBA range: start 0x100 length 0x100
00:28:12.975 crypto_ram1 : 5.50 60.02 3.75 0.00 0.00 2007276.89 23592.96 1605819.25
00:28:12.975 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:28:12.975 Verification LBA range: start 0x0 length 0x100
00:28:12.975 crypto_ram2 : 5.38 417.57 26.10 0.00 0.00 285092.43 22968.81 423424.98
00:28:12.975 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:28:12.975 Verification LBA range: start 0x100 length 0x100
00:28:12.975 crypto_ram2 : 5.37 408.61 25.54 0.00 0.00 291178.15 18474.91 429416.84
00:28:12.975 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:28:12.975 Verification LBA range: start 0x0 length 0x100
00:28:12.975 crypto_ram3 : 5.43 428.92 26.81 0.00 0.00 272370.17 7521.04 335544.32
00:28:12.975 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:28:12.975 Verification LBA range: start 0x100 length 0x100
00:28:12.975 crypto_ram3 : 5.43 425.38 26.59 0.00 0.00 274513.51 10610.59 321563.31
00:28:12.975 ===================================================================================================================
00:28:12.975 Total : 1919.60 119.97 0.00 0.00 502680.59 7521.04 1741634.80
00:28:13.232
00:28:13.232 real 0m8.437s
00:28:13.232 user 0m16.207s
00:28:13.232 sys 0m0.326s
00:28:13.232 12:09:03 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:28:13.232 12:09:03 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:28:13.232 ************************************
00:28:13.232 END TEST bdev_verify_big_io
00:28:13.232 ************************************
00:28:13.232 12:09:03 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:28:13.232 12:09:03 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:28:13.232 12:09:03 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:28:13.232 12:09:03 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:28:13.232 12:09:03 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:13.232 ************************************
00:28:13.232 START TEST bdev_write_zeroes
00:28:13.232 ************************************
00:28:13.232 12:09:03 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:28:13.232 [2024-07-12 12:09:03.428921] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization...
00:28:13.232 [2024-07-12 12:09:03.428960] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid799246 ]
00:28:13.492 [2024-07-12 12:09:03.491455] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:28:13.492 [2024-07-12 12:09:03.564045] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:28:13.492 [2024-07-12 12:09:03.584914] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:28:13.492 [2024-07-12 12:09:03.592942] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:28:13.492 [2024-07-12 12:09:03.600959] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:28:13.492 [2024-07-12 12:09:03.695424] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:28:16.025 [2024-07-12 12:09:05.832297] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:28:16.025 [2024-07-12 12:09:05.832345] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:28:16.025 [2024-07-12 12:09:05.832369] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:16.025 [2024-07-12 12:09:05.840314] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:28:16.025 [2024-07-12 12:09:05.840326] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:28:16.025 [2024-07-12 12:09:05.840331] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:16.025 [2024-07-12 12:09:05.848336] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:28:16.025 [2024-07-12 12:09:05.848346] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:28:16.025 [2024-07-12 12:09:05.848352] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:16.025 [2024-07-12 12:09:05.856356] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:28:16.025 [2024-07-12 12:09:05.856365] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:28:16.025 [2024-07-12 12:09:05.856370] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:16.025 Running I/O for 1 seconds...
00:28:16.962
00:28:16.962 Latency(us)
00:28:16.962 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:28:16.962 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:28:16.962 crypto_ram : 1.02 2996.11 11.70 0.00 0.00 42500.32 3838.54 49432.87
00:28:16.962 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:28:16.962 crypto_ram1 : 1.02 3001.74 11.73 0.00 0.00 42260.48 3682.50 47435.58
00:28:16.962 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:28:16.962 crypto_ram2 : 1.01 23360.68 91.25 0.00 0.00 5423.11 1622.80 6990.51
00:28:16.962 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:28:16.962 crypto_ram3 : 1.01 23393.31 91.38 0.00 0.00 5404.09 1622.80 5991.86
00:28:16.962 ===================================================================================================================
00:28:16.962 Total : 52751.85 206.06 0.00 0.00 9629.04 1622.80 49432.87
00:28:17.221
00:28:17.221 real 0m3.866s
00:28:17.221 user 0m3.586s
00:28:17.221 sys 0m0.247s
00:28:17.221 12:09:07 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:28:17.221 12:09:07 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:28:17.221 ************************************
00:28:17.221 END TEST bdev_write_zeroes
00:28:17.221 ************************************
00:28:17.221 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:28:17.221 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:28:17.221 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:28:17.221 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:28:17.221 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:17.221 ************************************
00:28:17.221 START TEST bdev_json_nonenclosed
00:28:17.221 ************************************
00:28:17.221 12:09:07 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:28:17.221 [2024-07-12 12:09:07.339777] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization...
00:28:17.221 [2024-07-12 12:09:07.339811] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid799939 ]
00:28:17.221 [2024-07-12 12:09:07.402302] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:28:17.480 [2024-07-12 12:09:07.475097] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:28:17.480 [2024-07-12 12:09:07.475147] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:28:17.480 [2024-07-12 12:09:07.475158] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:17.480 [2024-07-12 12:09:07.475164] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:17.480 00:28:17.480 real 0m0.239s 00:28:17.480 user 0m0.157s 00:28:17.480 sys 0m0.081s 00:28:17.480 12:09:07 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:28:17.480 12:09:07 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:17.480 12:09:07 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:28:17.480 ************************************ 00:28:17.480 END TEST bdev_json_nonenclosed 00:28:17.480 ************************************ 00:28:17.480 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:28:17.480 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:28:17.480 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:17.480 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:28:17.480 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:17.480 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:17.480 ************************************ 00:28:17.480 START TEST bdev_json_nonarray 00:28:17.480 ************************************ 00:28:17.480 12:09:07 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:17.480 
[2024-07-12 12:09:07.665342] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:28:17.480 [2024-07-12 12:09:07.665376] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid799966 ] 00:28:17.739 [2024-07-12 12:09:07.728383] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:17.739 [2024-07-12 12:09:07.800903] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:17.739 [2024-07-12 12:09:07.800959] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:28:17.739 [2024-07-12 12:09:07.800970] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:17.739 [2024-07-12 12:09:07.800976] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:17.739 00:28:17.739 real 0m0.256s 00:28:17.739 user 0m0.162s 00:28:17.739 sys 0m0.093s 00:28:17.739 12:09:07 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:28:17.739 12:09:07 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:17.739 12:09:07 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:28:17.739 ************************************ 00:28:17.739 END TEST bdev_json_nonarray 00:28:17.739 ************************************ 00:28:17.739 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:28:17.739 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:28:17.739 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:28:17.739 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:28:17.739 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw 
]] 00:28:17.739 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:28:17.740 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:28:17.740 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:28:17.740 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:17.740 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:28:17.740 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:28:17.740 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:28:17.740 12:09:07 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:28:17.740 00:28:17.740 real 1m6.022s 00:28:17.740 user 2m39.024s 00:28:17.740 sys 0m5.871s 00:28:17.740 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:17.740 12:09:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:17.740 ************************************ 00:28:17.740 END TEST blockdev_crypto_qat 00:28:17.740 ************************************ 00:28:17.740 12:09:07 -- common/autotest_common.sh@1142 -- # return 0 00:28:17.740 12:09:07 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:28:17.740 12:09:07 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:28:17.740 12:09:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:17.740 12:09:07 -- common/autotest_common.sh@10 -- # set +x 00:28:18.022 ************************************ 00:28:18.022 START TEST chaining 00:28:18.022 ************************************ 00:28:18.022 12:09:07 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:28:18.022 * Looking for test 
storage... 00:28:18.022 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:18.022 12:09:08 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@7 -- # uname -s 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:28:18.022 12:09:08 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:18.022 12:09:08 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:18.022 12:09:08 chaining -- 
scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:18.022 12:09:08 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:18.022 12:09:08 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:18.022 12:09:08 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:18.023 12:09:08 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:18.023 12:09:08 chaining -- paths/export.sh@5 -- # export PATH 00:28:18.023 12:09:08 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@47 -- # : 0 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:28:18.023 12:09:08 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:28:18.023 12:09:08 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:28:18.023 12:09:08 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:28:18.023 12:09:08 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:28:18.023 12:09:08 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:28:18.023 12:09:08 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:18.023 12:09:08 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:18.023 12:09:08 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:28:18.023 12:09:08 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:28:18.023 12:09:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@296 -- # e810=() 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@297 -- # x722=() 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@298 -- # mlx=() 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.0 (0x8086 - 0x159b)' 00:28:24.627 Found 0000:af:00.0 (0x8086 - 0x159b) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:24.627 
12:09:13 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.1 (0x8086 - 0x159b)' 00:28:24.627 Found 0000:af:00.1 (0x8086 - 0x159b) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.0: cvl_0_0' 00:28:24.627 Found net devices under 0000:af:00.0: cvl_0_0 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 
00:28:24.627 12:09:13 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.1: cvl_0_1' 00:28:24.627 Found net devices under 0000:af:00.1: cvl_0_1 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@248 -- # ip 
netns add cvl_0_0_ns_spdk 00:28:24.627 12:09:13 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:28:24.627 12:09:14 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:28:24.627 12:09:14 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:28:24.627 12:09:14 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:28:24.627 12:09:14 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:28:24.627 12:09:14 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:28:24.628 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:28:24.628 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.184 ms 00:28:24.628 00:28:24.628 --- 10.0.0.2 ping statistics --- 00:28:24.628 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:24.628 rtt min/avg/max/mdev = 0.184/0.184/0.184/0.000 ms 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:28:24.628 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:28:24.628 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.177 ms 00:28:24.628 00:28:24.628 --- 10.0.0.1 ping statistics --- 00:28:24.628 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:24.628 rtt min/avg/max/mdev = 0.177/0.177/0.177/0.000 ms 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@422 -- # return 0 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:28:24.628 12:09:14 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:28:24.628 12:09:14 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:28:24.628 12:09:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@481 -- # nvmfpid=803969 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@482 -- # waitforlisten 803969 00:28:24.628 12:09:14 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:28:24.628 12:09:14 chaining -- common/autotest_common.sh@829 -- # '[' -z 803969 ']' 00:28:24.628 12:09:14 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:24.628 12:09:14 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:24.628 12:09:14 chaining -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:24.628 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:24.628 12:09:14 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:24.628 12:09:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:24.628 [2024-07-12 12:09:14.275401] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:28:24.628 [2024-07-12 12:09:14.275441] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:24.628 [2024-07-12 12:09:14.344047] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:24.628 [2024-07-12 12:09:14.422022] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:28:24.628 [2024-07-12 12:09:14.422060] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:28:24.628 [2024-07-12 12:09:14.422067] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:28:24.628 [2024-07-12 12:09:14.422073] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:28:24.628 [2024-07-12 12:09:14.422077] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:28:24.628 [2024-07-12 12:09:14.422096] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:24.887 12:09:15 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:24.887 12:09:15 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:24.887 12:09:15 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:28:24.887 12:09:15 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:28:24.887 12:09:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:24.887 12:09:15 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:28:24.887 12:09:15 chaining -- bdev/chaining.sh@69 -- # mktemp 00:28:24.887 12:09:15 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.CEaeUJhzgJ 00:28:24.887 12:09:15 chaining -- bdev/chaining.sh@69 -- # mktemp 00:28:24.887 12:09:15 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.3JXcHUAwCU 00:28:24.887 12:09:15 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:28:24.887 12:09:15 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:28:24.887 12:09:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:24.887 12:09:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:24.887 malloc0 00:28:24.887 true 00:28:24.887 true 00:28:24.887 [2024-07-12 12:09:15.131001] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:25.145 crypto0 00:28:25.145 [2024-07-12 12:09:15.139028] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:25.145 crypto1 00:28:25.145 [2024-07-12 12:09:15.147117] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:28:25.145 [2024-07-12 12:09:15.163260] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:28:25.145 12:09:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:25.145 12:09:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.145 12:09:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.145 12:09:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:28:25.145 12:09:15 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:25.146 12:09:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.146 12:09:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.146 12:09:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:28:25.146 12:09:15 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:25.146 12:09:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.146 12:09:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.146 12:09:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:25.146 12:09:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.146 12:09:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.146 12:09:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.CEaeUJhzgJ bs=1K count=64 00:28:25.146 64+0 records in 00:28:25.146 64+0 records out 00:28:25.146 65536 bytes (66 kB, 64 KiB) copied, 0.000838999 s, 78.1 MB/s 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.CEaeUJhzgJ --ob Nvme0n1 --bs 65536 --count 1 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@25 -- # local config 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:25.146 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:25.146 "subsystems": [ 00:28:25.146 { 00:28:25.146 "subsystem": "bdev", 00:28:25.146 "config": [ 00:28:25.146 { 00:28:25.146 "method": "bdev_nvme_attach_controller", 00:28:25.146 "params": { 00:28:25.146 "trtype": "tcp", 00:28:25.146 "adrfam": "IPv4", 00:28:25.146 "name": "Nvme0", 00:28:25.146 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:25.146 "traddr": "10.0.0.2", 00:28:25.146 "trsvcid": "4420" 00:28:25.146 } 00:28:25.146 }, 00:28:25.146 { 00:28:25.146 "method": "bdev_set_options", 00:28:25.146 "params": { 00:28:25.146 "bdev_auto_examine": false 00:28:25.146 } 00:28:25.146 } 00:28:25.146 ] 00:28:25.146 } 00:28:25.146 ] 00:28:25.146 }' 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.CEaeUJhzgJ --ob Nvme0n1 --bs 65536 --count 1 00:28:25.146 12:09:15 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:25.146 "subsystems": [ 00:28:25.146 { 00:28:25.146 "subsystem": "bdev", 00:28:25.146 "config": [ 00:28:25.146 { 00:28:25.146 "method": "bdev_nvme_attach_controller", 00:28:25.146 "params": { 
00:28:25.146 "trtype": "tcp", 00:28:25.146 "adrfam": "IPv4", 00:28:25.146 "name": "Nvme0", 00:28:25.146 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:25.146 "traddr": "10.0.0.2", 00:28:25.146 "trsvcid": "4420" 00:28:25.146 } 00:28:25.146 }, 00:28:25.146 { 00:28:25.146 "method": "bdev_set_options", 00:28:25.146 "params": { 00:28:25.146 "bdev_auto_examine": false 00:28:25.146 } 00:28:25.146 } 00:28:25.146 ] 00:28:25.146 } 00:28:25.146 ] 00:28:25.146 }' 00:28:25.402 [2024-07-12 12:09:15.427146] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:28:25.402 [2024-07-12 12:09:15.427186] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid804212 ] 00:28:25.402 [2024-07-12 12:09:15.490964] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:25.402 [2024-07-12 12:09:15.563818] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:25.917  Copying: 64/64 [kB] (average 15 MBps) 00:28:25.917 00:28:25.917 12:09:15 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:28:25.917 12:09:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.917 12:09:15 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:25.917 12:09:15 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:25.917 12:09:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.917 12:09:15 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:25.917 12:09:15 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:25.917 12:09:15 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:25.917 12:09:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.917 12:09:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.917 12:09:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 
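The JSON config echoed above is produced by piping `gen_nvme.sh` output through a jq filter that appends a `bdev_set_options` entry to the first subsystem's config array. A minimal standalone sketch of that append pattern, with a simplified sample payload inlined in place of the real `gen_nvme.sh` output, only `jq` assumed available:

```shell
# Append a bdev_set_options entry to .subsystems[0].config, mirroring the
# jq filter bdev/chaining.sh uses before feeding the config to spdk_dd.
# The input JSON here is a cut-down sample, not real gen_nvme.sh output.
config='{"subsystems":[{"subsystem":"bdev","config":[{"method":"bdev_nvme_attach_controller"}]}]}'
echo "$config" | jq '.subsystems[0].config[.subsystems[0].config | length] |=
  {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}'
```

The `.config[.config | length] |= …` idiom works because indexing one past the end of an array yields `null`, and the update assignment then sets that slot, i.e. it appends.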
00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:25.917 12:09:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.917 12:09:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:25.917 12:09:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:25.917 12:09:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.917 12:09:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.917 12:09:16 chaining -- common/autotest_common.sh@587 -- # 
[[ 0 == 0 ]] 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:25.917 12:09:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.917 12:09:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.917 12:09:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@96 -- # update_stats 00:28:25.917 12:09:16 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:25.918 12:09:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.918 12:09:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.918 12:09:16 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:25.918 12:09:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:25.918 12:09:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.918 12:09:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:26.176 12:09:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.176 12:09:16 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:28:26.176 12:09:16 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:26.176 12:09:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:26.177 12:09:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.177 12:09:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:26.177 12:09:16 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:26.177 12:09:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.177 12:09:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:26.177 12:09:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.3JXcHUAwCU --ib Nvme0n1 --bs 65536 --count 1 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@25 -- # local config 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:26.177 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:26.177 "subsystems": [ 00:28:26.177 { 00:28:26.177 "subsystem": "bdev", 00:28:26.177 "config": [ 00:28:26.177 { 00:28:26.177 "method": "bdev_nvme_attach_controller", 00:28:26.177 "params": { 
00:28:26.177 "trtype": "tcp", 00:28:26.177 "adrfam": "IPv4", 00:28:26.177 "name": "Nvme0", 00:28:26.177 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:26.177 "traddr": "10.0.0.2", 00:28:26.177 "trsvcid": "4420" 00:28:26.177 } 00:28:26.177 }, 00:28:26.177 { 00:28:26.177 "method": "bdev_set_options", 00:28:26.177 "params": { 00:28:26.177 "bdev_auto_examine": false 00:28:26.177 } 00:28:26.177 } 00:28:26.177 ] 00:28:26.177 } 00:28:26.177 ] 00:28:26.177 }' 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.3JXcHUAwCU --ib Nvme0n1 --bs 65536 --count 1 00:28:26.177 12:09:16 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:26.177 "subsystems": [ 00:28:26.177 { 00:28:26.177 "subsystem": "bdev", 00:28:26.177 "config": [ 00:28:26.177 { 00:28:26.177 "method": "bdev_nvme_attach_controller", 00:28:26.177 "params": { 00:28:26.177 "trtype": "tcp", 00:28:26.177 "adrfam": "IPv4", 00:28:26.177 "name": "Nvme0", 00:28:26.177 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:26.177 "traddr": "10.0.0.2", 00:28:26.177 "trsvcid": "4420" 00:28:26.177 } 00:28:26.177 }, 00:28:26.177 { 00:28:26.177 "method": "bdev_set_options", 00:28:26.177 "params": { 00:28:26.177 "bdev_auto_examine": false 00:28:26.177 } 00:28:26.177 } 00:28:26.177 ] 00:28:26.177 } 00:28:26.177 ] 00:28:26.177 }' 00:28:26.177 [2024-07-12 12:09:16.357836] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:28:26.177 [2024-07-12 12:09:16.357882] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid804291 ] 00:28:26.177 [2024-07-12 12:09:16.421081] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:26.436 [2024-07-12 12:09:16.494311] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:26.694  Copying: 64/64 [kB] (average 10 MBps) 00:28:26.694 00:28:26.694 12:09:16 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:28:26.694 12:09:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:26.694 12:09:16 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:26.694 12:09:16 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:26.694 12:09:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:26.694 12:09:16 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:26.694 12:09:16 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:26.694 12:09:16 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:26.694 12:09:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.694 12:09:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:26.694 12:09:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:26.953 
12:09:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:26.953 12:09:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.953 12:09:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:26.953 12:09:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:28:26.953 12:09:16 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:28:26.953 12:09:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:26.953 12:09:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:26.953 12:09:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:26.953 12:09:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:26.953 12:09:17 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:26.953 12:09:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:26.953 12:09:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.953 12:09:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:26.953 12:09:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:26.953 12:09:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.953 12:09:17 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:28:26.953 12:09:17 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:26.954 12:09:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.954 12:09:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:26.954 12:09:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.CEaeUJhzgJ /tmp/tmp.3JXcHUAwCU 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@25 -- # local config 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:26.954 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:26.954 "subsystems": [ 00:28:26.954 { 00:28:26.954 "subsystem": "bdev", 00:28:26.954 "config": [ 00:28:26.954 { 00:28:26.954 "method": "bdev_nvme_attach_controller", 00:28:26.954 "params": { 00:28:26.954 "trtype": "tcp", 00:28:26.954 "adrfam": "IPv4", 00:28:26.954 "name": "Nvme0", 00:28:26.954 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:26.954 "traddr": "10.0.0.2", 00:28:26.954 "trsvcid": "4420" 00:28:26.954 } 00:28:26.954 }, 00:28:26.954 { 00:28:26.954 "method": "bdev_set_options", 00:28:26.954 "params": { 00:28:26.954 "bdev_auto_examine": false 00:28:26.954 } 00:28:26.954 } 00:28:26.954 ] 00:28:26.954 } 00:28:26.954 ] 00:28:26.954 }' 00:28:26.954 12:09:17 chaining 
-- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:28:26.954 12:09:17 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:26.954 "subsystems": [ 00:28:26.954 { 00:28:26.954 "subsystem": "bdev", 00:28:26.954 "config": [ 00:28:26.954 { 00:28:26.954 "method": "bdev_nvme_attach_controller", 00:28:26.954 "params": { 00:28:26.954 "trtype": "tcp", 00:28:26.954 "adrfam": "IPv4", 00:28:26.954 "name": "Nvme0", 00:28:26.954 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:26.954 "traddr": "10.0.0.2", 00:28:26.954 "trsvcid": "4420" 00:28:26.954 } 00:28:26.954 }, 00:28:26.954 { 00:28:26.954 "method": "bdev_set_options", 00:28:26.954 "params": { 00:28:26.954 "bdev_auto_examine": false 00:28:26.954 } 00:28:26.954 } 00:28:26.954 ] 00:28:26.954 } 00:28:26.954 ] 00:28:26.954 }' 00:28:26.954 [2024-07-12 12:09:17.186800] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:28:26.954 [2024-07-12 12:09:17.186842] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid804535 ] 00:28:27.215 [2024-07-12 12:09:17.250153] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:27.215 [2024-07-12 12:09:17.319638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:27.474  Copying: 64/64 [kB] (average 20 MBps) 00:28:27.474 00:28:27.474 12:09:17 chaining -- bdev/chaining.sh@106 -- # update_stats 00:28:27.474 12:09:17 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:27.474 12:09:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:27.474 12:09:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:27.474 12:09:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:27.474 12:09:17 chaining -- bdev/chaining.sh@39 -- # 
rpc=rpc_cmd 00:28:27.474 12:09:17 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:27.474 12:09:17 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:27.474 12:09:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:27.474 12:09:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:27.474 12:09:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:27.474 12:09:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:27.734 12:09:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:27.734 12:09:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:27.734 12:09:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:27.734 12:09:17 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:27.734 12:09:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:27.734 12:09:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:27.734 12:09:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:27.734 12:09:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:27.734 12:09:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:27.734 12:09:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.CEaeUJhzgJ --ob Nvme0n1 --bs 4096 --count 16 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@25 -- # local config 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:27.734 12:09:17 chaining 
-- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:27.734 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:27.734 "subsystems": [ 00:28:27.734 { 00:28:27.734 "subsystem": "bdev", 00:28:27.734 "config": [ 00:28:27.734 { 00:28:27.734 "method": "bdev_nvme_attach_controller", 00:28:27.734 "params": { 00:28:27.734 "trtype": "tcp", 00:28:27.734 "adrfam": "IPv4", 00:28:27.734 "name": "Nvme0", 00:28:27.734 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:27.734 "traddr": "10.0.0.2", 00:28:27.734 "trsvcid": "4420" 00:28:27.734 } 00:28:27.734 }, 00:28:27.734 { 00:28:27.734 "method": "bdev_set_options", 00:28:27.734 "params": { 00:28:27.734 "bdev_auto_examine": false 00:28:27.734 } 00:28:27.734 } 00:28:27.734 ] 00:28:27.734 } 00:28:27.734 ] 00:28:27.734 }' 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.CEaeUJhzgJ --ob Nvme0n1 --bs 4096 --count 16 00:28:27.734 12:09:17 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:27.734 "subsystems": [ 00:28:27.734 { 00:28:27.734 "subsystem": "bdev", 00:28:27.734 "config": [ 00:28:27.734 { 00:28:27.734 "method": "bdev_nvme_attach_controller", 00:28:27.734 "params": { 00:28:27.734 "trtype": "tcp", 00:28:27.734 "adrfam": "IPv4", 00:28:27.734 "name": "Nvme0", 00:28:27.734 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:27.734 "traddr": "10.0.0.2", 00:28:27.734 "trsvcid": "4420" 00:28:27.734 } 00:28:27.734 }, 00:28:27.734 { 00:28:27.734 "method": "bdev_set_options", 00:28:27.734 "params": { 00:28:27.734 "bdev_auto_examine": false 00:28:27.734 } 00:28:27.734 } 00:28:27.734 ] 00:28:27.734 } 00:28:27.734 ] 00:28:27.734 }' 00:28:27.734 [2024-07-12 12:09:17.920079] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:28:27.734 [2024-07-12 12:09:17.920122] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid804579 ] 00:28:27.993 [2024-07-12 12:09:17.985615] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:27.993 [2024-07-12 12:09:18.062787] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:28.252  Copying: 64/64 [kB] (average 15 MBps) 00:28:28.252 00:28:28.252 12:09:18 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:28:28.252 12:09:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:28.252 12:09:18 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:28.252 12:09:18 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:28.252 12:09:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:28.252 12:09:18 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:28.252 12:09:18 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:28.252 12:09:18 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:28.252 12:09:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:28.252 12:09:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:28.252 12:09:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:28.511 
12:09:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@114 -- # update_stats 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:28.511 12:09:18 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:28.511 12:09:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:28.511 12:09:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:28.512 12:09:18 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:28.512 12:09:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:28.512 12:09:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:28.512 12:09:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:28.512 12:09:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:28.512 12:09:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:28.771 12:09:18 chaining -- 
bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:28.771 12:09:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:28.771 12:09:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:28.771 12:09:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@117 -- # : 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.3JXcHUAwCU --ib Nvme0n1 --bs 4096 --count 16 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@25 -- # local config 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:28.771 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:28.771 "subsystems": [ 00:28:28.771 { 00:28:28.771 "subsystem": "bdev", 00:28:28.771 "config": [ 00:28:28.771 { 00:28:28.771 "method": "bdev_nvme_attach_controller", 00:28:28.771 "params": { 00:28:28.771 "trtype": "tcp", 00:28:28.771 "adrfam": "IPv4", 00:28:28.771 "name": "Nvme0", 00:28:28.771 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:28.771 "traddr": "10.0.0.2", 00:28:28.771 "trsvcid": "4420" 00:28:28.771 } 00:28:28.771 }, 00:28:28.771 { 00:28:28.771 "method": "bdev_set_options", 00:28:28.771 "params": { 00:28:28.771 "bdev_auto_examine": false 00:28:28.771 } 00:28:28.771 } 00:28:28.771 ] 00:28:28.771 } 00:28:28.771 ] 00:28:28.771 }' 00:28:28.771 12:09:18 chaining 
-- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.3JXcHUAwCU --ib Nvme0n1 --bs 4096 --count 16 00:28:28.771 12:09:18 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:28.771 "subsystems": [ 00:28:28.771 { 00:28:28.771 "subsystem": "bdev", 00:28:28.771 "config": [ 00:28:28.771 { 00:28:28.771 "method": "bdev_nvme_attach_controller", 00:28:28.771 "params": { 00:28:28.771 "trtype": "tcp", 00:28:28.771 "adrfam": "IPv4", 00:28:28.771 "name": "Nvme0", 00:28:28.771 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:28.771 "traddr": "10.0.0.2", 00:28:28.771 "trsvcid": "4420" 00:28:28.771 } 00:28:28.771 }, 00:28:28.771 { 00:28:28.771 "method": "bdev_set_options", 00:28:28.771 "params": { 00:28:28.771 "bdev_auto_examine": false 00:28:28.771 } 00:28:28.771 } 00:28:28.771 ] 00:28:28.771 } 00:28:28.771 ] 00:28:28.771 }' 00:28:28.771 [2024-07-12 12:09:18.897780] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:28:28.771 [2024-07-12 12:09:18.897819] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid804832 ] 00:28:28.772 [2024-07-12 12:09:18.961545] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:29.031 [2024-07-12 12:09:19.033185] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:29.290  Copying: 64/64 [kB] (average 735 kBps) 00:28:29.290 00:28:29.290 12:09:19 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:28:29.290 12:09:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:29.290 12:09:19 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:29.290 12:09:19 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:29.290 12:09:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:29.290 12:09:19 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:29.290 12:09:19 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:29.290 12:09:19 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:29.290 12:09:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:29.290 12:09:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:29.549 12:09:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:29.549 
12:09:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:29.549 12:09:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:29.549 12:09:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:29.549 12:09:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:29.549 12:09:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.CEaeUJhzgJ /tmp/tmp.3JXcHUAwCU 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.CEaeUJhzgJ /tmp/tmp.3JXcHUAwCU 00:28:29.550 12:09:19 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@117 -- # sync 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@120 -- # set +e 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:28:29.550 rmmod nvme_tcp 00:28:29.550 rmmod nvme_fabrics 00:28:29.550 rmmod nvme_keyring 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@124 -- # set -e 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@125 -- # return 0 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@489 -- # '[' -n 803969 ']' 00:28:29.550 12:09:19 chaining -- nvmf/common.sh@490 -- # killprocess 803969 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@948 -- # '[' -z 803969 ']' 00:28:29.550 12:09:19 chaining -- 
common/autotest_common.sh@952 -- # kill -0 803969 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@953 -- # uname 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:29.550 12:09:19 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 803969 00:28:29.809 12:09:19 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:28:29.809 12:09:19 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:29.809 12:09:19 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 803969' 00:28:29.809 killing process with pid 803969 00:28:29.809 12:09:19 chaining -- common/autotest_common.sh@967 -- # kill 803969 00:28:29.809 12:09:19 chaining -- common/autotest_common.sh@972 -- # wait 803969 00:28:29.809 12:09:19 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:28:29.809 12:09:19 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:28:29.809 12:09:19 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:28:29.809 12:09:19 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:28:29.809 12:09:19 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:28:29.809 12:09:19 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:29.809 12:09:19 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:29.809 12:09:19 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:32.345 12:09:22 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:28:32.345 12:09:22 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:28:32.345 12:09:22 chaining -- bdev/chaining.sh@132 -- # bperfpid=805351 00:28:32.345 12:09:22 chaining -- bdev/chaining.sh@134 -- # waitforlisten 805351 00:28:32.345 12:09:22 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w 
verify -o 4096 -q 256 --wait-for-rpc -z 00:28:32.345 12:09:22 chaining -- common/autotest_common.sh@829 -- # '[' -z 805351 ']' 00:28:32.345 12:09:22 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:32.345 12:09:22 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:32.345 12:09:22 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:32.345 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:32.345 12:09:22 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:32.345 12:09:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:32.345 [2024-07-12 12:09:22.102551] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:28:32.345 [2024-07-12 12:09:22.102591] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid805351 ] 00:28:32.345 [2024-07-12 12:09:22.162975] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:32.345 [2024-07-12 12:09:22.244624] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:32.914 12:09:22 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:32.914 12:09:22 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:32.914 12:09:22 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:28:32.914 12:09:22 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:32.914 12:09:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:32.914 malloc0 00:28:32.914 true 00:28:32.914 true 00:28:32.914 [2024-07-12 12:09:23.025765] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:32.914 crypto0 00:28:32.914 [2024-07-12 12:09:23.033789] vbdev_crypto_rpc.c: 
115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:32.914 crypto1 00:28:32.914 12:09:23 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:32.914 12:09:23 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:32.914 Running I/O for 5 seconds... 00:28:38.185 00:28:38.185 Latency(us) 00:28:38.185 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:38.185 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:38.185 Verification LBA range: start 0x0 length 0x2000 00:28:38.185 crypto1 : 5.01 17469.96 68.24 0.00 0.00 14618.44 4244.24 11359.57 00:28:38.185 =================================================================================================================== 00:28:38.185 Total : 17469.96 68.24 0.00 0.00 14618.44 4244.24 11359.57 00:28:38.185 0 00:28:38.185 12:09:28 chaining -- bdev/chaining.sh@146 -- # killprocess 805351 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@948 -- # '[' -z 805351 ']' 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@952 -- # kill -0 805351 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@953 -- # uname 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 805351 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 805351' 00:28:38.185 killing process with pid 805351 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@967 -- # kill 805351 00:28:38.185 Received shutdown signal, test time was about 5.000000 seconds 00:28:38.185 00:28:38.185 Latency(us) 00:28:38.185 Device 
Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:38.185 =================================================================================================================== 00:28:38.185 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@972 -- # wait 805351 00:28:38.185 12:09:28 chaining -- bdev/chaining.sh@152 -- # bperfpid=806475 00:28:38.185 12:09:28 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:28:38.185 12:09:28 chaining -- bdev/chaining.sh@154 -- # waitforlisten 806475 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@829 -- # '[' -z 806475 ']' 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:38.185 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:38.185 12:09:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:38.185 [2024-07-12 12:09:28.430515] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:28:38.185 [2024-07-12 12:09:28.430576] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid806475 ] 00:28:38.444 [2024-07-12 12:09:28.494270] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:38.444 [2024-07-12 12:09:28.572011] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:39.012 12:09:29 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:39.012 12:09:29 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:39.012 12:09:29 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:28:39.012 12:09:29 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:39.012 12:09:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:39.271 malloc0 00:28:39.271 true 00:28:39.271 true 00:28:39.271 [2024-07-12 12:09:29.329533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:28:39.271 [2024-07-12 12:09:29.329567] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:39.271 [2024-07-12 12:09:29.329580] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf36330 00:28:39.271 [2024-07-12 12:09:29.329586] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:39.271 [2024-07-12 12:09:29.330317] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:39.271 [2024-07-12 12:09:29.330334] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:28:39.271 pt0 00:28:39.271 [2024-07-12 12:09:29.337561] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:39.271 crypto0 00:28:39.271 [2024-07-12 12:09:29.345581] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:39.271 crypto1 00:28:39.271 12:09:29 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:39.271 12:09:29 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:39.271 Running I/O for 5 seconds... 00:28:44.545 00:28:44.545 Latency(us) 00:28:44.545 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:44.545 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:44.545 Verification LBA range: start 0x0 length 0x2000 00:28:44.545 crypto1 : 5.01 14189.49 55.43 0.00 0.00 18001.34 4369.07 12670.29 00:28:44.545 =================================================================================================================== 00:28:44.545 Total : 14189.49 55.43 0.00 0.00 18001.34 4369.07 12670.29 00:28:44.545 0 00:28:44.545 12:09:34 chaining -- bdev/chaining.sh@167 -- # killprocess 806475 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 806475 ']' 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@952 -- # kill -0 806475 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@953 -- # uname 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 806475 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 806475' 00:28:44.545 killing process with pid 806475 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@967 -- # kill 806475 00:28:44.545 Received shutdown signal, test time was about 5.000000 seconds 00:28:44.545 00:28:44.545 Latency(us) 00:28:44.545 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:44.545 
=================================================================================================================== 00:28:44.545 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@972 -- # wait 806475 00:28:44.545 12:09:34 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:28:44.545 12:09:34 chaining -- bdev/chaining.sh@170 -- # killprocess 806475 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 806475 ']' 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@952 -- # kill -0 806475 00:28:44.545 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (806475) - No such process 00:28:44.545 12:09:34 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 806475 is not found' 00:28:44.545 Process with pid 806475 is not found 00:28:44.546 12:09:34 chaining -- bdev/chaining.sh@171 -- # wait 806475 00:28:44.546 12:09:34 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:44.546 12:09:34 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:44.546 12:09:34 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:28:44.546 12:09:34 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@296 -- # e810=() 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@297 -- # x722=() 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@298 -- # mlx=() 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:28:44.546 12:09:34 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.0 (0x8086 - 0x159b)' 00:28:44.546 Found 0000:af:00.0 (0x8086 - 0x159b) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.1 (0x8086 - 0x159b)' 00:28:44.546 Found 0000:af:00.1 (0x8086 - 0x159b) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:44.546 12:09:34 chaining -- 
nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.0: cvl_0_0' 00:28:44.546 Found net devices under 0000:af:00.0: cvl_0_0 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.1: cvl_0_1' 00:28:44.546 Found net devices under 0000:af:00.1: cvl_0_1 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 
00:28:44.546 12:09:34 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:28:44.546 12:09:34 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:28:44.805 12:09:34 
chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:28:44.805 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:28:44.805 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.179 ms 00:28:44.805 00:28:44.805 --- 10.0.0.2 ping statistics --- 00:28:44.805 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:44.805 rtt min/avg/max/mdev = 0.179/0.179/0.179/0.000 ms 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:28:44.805 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:28:44.805 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.164 ms 00:28:44.805 00:28:44.805 --- 10.0.0.1 ping statistics --- 00:28:44.805 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:44.805 rtt min/avg/max/mdev = 0.164/0.164/0.164/0.000 ms 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@422 -- # return 0 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:28:44.805 12:09:34 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:28:44.805 12:09:34 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:28:44.805 12:09:34 chaining -- common/autotest_common.sh@10 -- # set +x 
00:28:44.805 12:09:34 chaining -- nvmf/common.sh@481 -- # nvmfpid=807436 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@482 -- # waitforlisten 807436 00:28:44.805 12:09:34 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:28:44.805 12:09:34 chaining -- common/autotest_common.sh@829 -- # '[' -z 807436 ']' 00:28:44.805 12:09:34 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:44.805 12:09:34 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:44.805 12:09:34 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:44.805 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:44.805 12:09:34 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:44.805 12:09:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:44.805 [2024-07-12 12:09:35.048953] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:28:44.805 [2024-07-12 12:09:35.048994] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:45.064 [2024-07-12 12:09:35.114556] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:45.064 [2024-07-12 12:09:35.190546] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:28:45.064 [2024-07-12 12:09:35.190580] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 
00:28:45.064 [2024-07-12 12:09:35.190587] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:28:45.064 [2024-07-12 12:09:35.190593] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:28:45.064 [2024-07-12 12:09:35.190598] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:28:45.064 [2024-07-12 12:09:35.190616] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:45.632 12:09:35 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:45.632 12:09:35 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:45.632 12:09:35 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:28:45.632 12:09:35 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:28:45.632 12:09:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:45.632 12:09:35 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:28:45.632 12:09:35 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:28:45.632 12:09:35 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:45.632 12:09:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:45.891 malloc0 00:28:45.891 [2024-07-12 12:09:35.886642] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:28:45.891 [2024-07-12 12:09:35.902793] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:28:45.891 12:09:35 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:45.891 12:09:35 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:28:45.891 12:09:35 chaining -- bdev/chaining.sh@189 -- # bperfpid=807669 00:28:45.891 12:09:35 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify 
-o 4096 -q 256 --wait-for-rpc -z 00:28:45.891 12:09:35 chaining -- bdev/chaining.sh@191 -- # waitforlisten 807669 /var/tmp/bperf.sock 00:28:45.891 12:09:35 chaining -- common/autotest_common.sh@829 -- # '[' -z 807669 ']' 00:28:45.891 12:09:35 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:28:45.891 12:09:35 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:45.891 12:09:35 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:28:45.891 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:28:45.891 12:09:35 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:45.891 12:09:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:45.891 [2024-07-12 12:09:35.964200] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 00:28:45.891 [2024-07-12 12:09:35.964236] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid807669 ] 00:28:45.891 [2024-07-12 12:09:36.027118] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:45.891 [2024-07-12 12:09:36.098958] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:46.828 12:09:36 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:46.828 12:09:36 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:46.828 12:09:36 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:28:46.828 12:09:36 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:28:46.828 [2024-07-12 12:09:37.065898] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:46.828 nvme0n1 00:28:46.828 true 00:28:46.828 crypto0 
00:28:47.087 12:09:37 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:28:47.087 Running I/O for 5 seconds... 00:28:52.352 00:28:52.352 Latency(us) 00:28:52.352 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:52.352 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:52.352 Verification LBA range: start 0x0 length 0x2000 00:28:52.352 crypto0 : 5.01 13025.10 50.88 0.00 0.00 19606.81 3245.59 16352.79 00:28:52.352 =================================================================================================================== 00:28:52.352 Total : 13025.10 50.88 0.00 0.00 19606.81 3245.59 16352.79 00:28:52.352 0 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@205 -- # sequence=130606 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:28:52.352 12:09:42 chaining -- 
bdev/chaining.sh@37 -- # local event opcode rpc 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@206 -- # encrypt=65303 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:52.352 12:09:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@207 -- # decrypt=65303 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:28:52.612 12:09:42 
chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:28:52.612 12:09:42 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:52.927 12:09:42 chaining -- bdev/chaining.sh@208 -- # crc32c=130606 00:28:52.927 12:09:42 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:28:52.927 12:09:42 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:28:52.927 12:09:42 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:28:52.927 12:09:42 chaining -- bdev/chaining.sh@214 -- # killprocess 807669 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@948 -- # '[' -z 807669 ']' 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@952 -- # kill -0 807669 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@953 -- # uname 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 807669 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 807669' 00:28:52.927 killing process with pid 807669 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@967 -- # kill 
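Aside (not part of the captured log): the get_stat helper traced above queries accel_get_stats over the bperf RPC socket and extracts counters with jq. The same two filters can be reproduced offline; the `stats` JSON below is a hand-written stand-in shaped like the response queried here, with the counter values taken from this run, not captured harness output.

```shell
# Offline sketch of chaining.sh's jq filters (assumes jq is installed).
# The payload is an assumed stand-in modelled on accel_get_stats output.
stats='{"sequence_executed":130606,"operations":[{"opcode":"encrypt","executed":65303},{"opcode":"decrypt","executed":65303}]}'
# whole-sequence counter, as in chaining.sh@41:
echo "$stats" | jq -r .sequence_executed
# per-opcode executed count, as in chaining.sh@44:
echo "$stats" | jq -r '.operations[] | select(.opcode == "encrypt").executed'
```

Running this prints the two counters the trace compares (130606 and 65303).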
807669 00:28:52.927 Received shutdown signal, test time was about 5.000000 seconds 00:28:52.927 00:28:52.927 Latency(us) 00:28:52.927 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:52.927 =================================================================================================================== 00:28:52.927 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:52.927 12:09:42 chaining -- common/autotest_common.sh@972 -- # wait 807669 00:28:52.927 12:09:43 chaining -- bdev/chaining.sh@219 -- # bperfpid=808822 00:28:52.927 12:09:43 chaining -- bdev/chaining.sh@221 -- # waitforlisten 808822 /var/tmp/bperf.sock 00:28:52.927 12:09:43 chaining -- common/autotest_common.sh@829 -- # '[' -z 808822 ']' 00:28:52.927 12:09:43 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:28:52.927 12:09:43 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:52.927 12:09:43 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:28:52.927 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:28:52.927 12:09:43 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:52.927 12:09:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:52.927 12:09:43 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:28:53.214 [2024-07-12 12:09:43.195738] Starting SPDK v24.09-pre git sha1 b2ac96cc2 / DPDK 24.03.0 initialization... 
00:28:53.214 [2024-07-12 12:09:43.195780] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid808822 ] 00:28:53.214 [2024-07-12 12:09:43.262406] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:53.214 [2024-07-12 12:09:43.334501] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:53.780 12:09:43 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:53.780 12:09:43 chaining -- common/autotest_common.sh@862 -- # return 0 00:28:53.780 12:09:43 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:28:53.780 12:09:43 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:28:54.344 [2024-07-12 12:09:44.305295] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:54.344 nvme0n1 00:28:54.344 true 00:28:54.344 crypto0 00:28:54.344 12:09:44 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:28:54.344 Running I/O for 5 seconds... 
00:28:59.613 00:28:59.613 Latency(us) 00:28:59.613 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:59.613 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:28:59.613 Verification LBA range: start 0x0 length 0x200 00:28:59.613 crypto0 : 5.00 2391.51 149.47 0.00 0.00 13107.39 795.79 13419.28 00:28:59.613 =================================================================================================================== 00:28:59.613 Total : 2391.51 149.47 0.00 0.00 13107.39 795.79 13419.28 00:28:59.613 0 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@233 -- # sequence=23938 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:59.614 12:09:49 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@234 -- # encrypt=11969 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:59.614 12:09:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@235 -- # decrypt=11969 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:28:59.873 12:09:49 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:28:59.873 12:09:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:29:00.132 12:09:50 chaining -- bdev/chaining.sh@236 -- # crc32c=23938 00:29:00.132 12:09:50 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:29:00.132 12:09:50 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:29:00.132 12:09:50 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:29:00.132 12:09:50 chaining -- bdev/chaining.sh@242 -- # killprocess 808822 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@948 -- # '[' -z 808822 ']' 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@952 -- # kill -0 808822 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@953 -- # uname 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 808822 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 808822' 00:29:00.132 killing process with pid 808822 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@967 -- # kill 808822 00:29:00.132 Received shutdown signal, test time was about 5.000000 seconds 00:29:00.132 00:29:00.132 Latency(us) 00:29:00.132 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:00.132 
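Aside (not part of the captured log): the pass criteria checked at chaining.sh@238-240 can be recomputed by hand from the stat values this second bdevperf run reports (sequence_executed=23938, encrypt=11969, decrypt=11969, crc32c=23938). A minimal sketch, not harness code:

```shell
# Stat values reported in the trace above for the second run.
sequence=23938
encrypt=11969
decrypt=11969
crc32c=23938
# Same invariant the harness asserts: every executed sequence is one
# encrypt plus one decrypt, each paired with a crc32c operation.
if [ "$((encrypt + decrypt))" -eq "$sequence" ] && [ "$((encrypt + decrypt))" -eq "$crc32c" ]; then
  echo "chaining stats consistent"
fi
```

The first run satisfies the same identity (65303 + 65303 = 130606).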
=================================================================================================================== 00:29:00.132 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:00.132 12:09:50 chaining -- common/autotest_common.sh@972 -- # wait 808822 00:29:00.132 12:09:50 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:29:00.132 12:09:50 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:29:00.132 12:09:50 chaining -- nvmf/common.sh@117 -- # sync 00:29:00.132 12:09:50 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:29:00.132 12:09:50 chaining -- nvmf/common.sh@120 -- # set +e 00:29:00.132 12:09:50 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:29:00.132 12:09:50 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:29:00.391 rmmod nvme_tcp 00:29:00.391 rmmod nvme_fabrics 00:29:00.391 rmmod nvme_keyring 00:29:00.391 12:09:50 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:29:00.391 12:09:50 chaining -- nvmf/common.sh@124 -- # set -e 00:29:00.391 12:09:50 chaining -- nvmf/common.sh@125 -- # return 0 00:29:00.391 12:09:50 chaining -- nvmf/common.sh@489 -- # '[' -n 807436 ']' 00:29:00.391 12:09:50 chaining -- nvmf/common.sh@490 -- # killprocess 807436 00:29:00.391 12:09:50 chaining -- common/autotest_common.sh@948 -- # '[' -z 807436 ']' 00:29:00.391 12:09:50 chaining -- common/autotest_common.sh@952 -- # kill -0 807436 00:29:00.391 12:09:50 chaining -- common/autotest_common.sh@953 -- # uname 00:29:00.391 12:09:50 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:00.391 12:09:50 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 807436 00:29:00.391 12:09:50 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:00.391 12:09:50 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:00.391 12:09:50 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 807436' 00:29:00.392 killing process with pid 807436 
00:29:00.392 12:09:50 chaining -- common/autotest_common.sh@967 -- # kill 807436 00:29:00.392 12:09:50 chaining -- common/autotest_common.sh@972 -- # wait 807436 00:29:00.651 12:09:50 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:29:00.651 12:09:50 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:29:00.651 12:09:50 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:29:00.651 12:09:50 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:29:00.651 12:09:50 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:29:00.651 12:09:50 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:29:00.651 12:09:50 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:29:00.651 12:09:50 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:29:02.556 12:09:52 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:29:02.556 12:09:52 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:29:02.556 00:29:02.556 real 0m44.742s 00:29:02.556 user 0m54.472s 00:29:02.556 sys 0m9.750s 00:29:02.556 12:09:52 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:02.556 12:09:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:02.556 ************************************ 00:29:02.556 END TEST chaining 00:29:02.556 ************************************ 00:29:02.556 12:09:52 -- common/autotest_common.sh@1142 -- # return 0 00:29:02.556 12:09:52 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:29:02.556 12:09:52 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:29:02.556 12:09:52 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:29:02.556 12:09:52 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:29:02.556 12:09:52 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:29:02.556 12:09:52 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:29:02.556 12:09:52 -- common/autotest_common.sh@722 -- # xtrace_disable 00:29:02.556 12:09:52 -- 
common/autotest_common.sh@10 -- # set +x 00:29:02.556 12:09:52 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:29:02.556 12:09:52 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:29:02.556 12:09:52 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:29:02.556 12:09:52 -- common/autotest_common.sh@10 -- # set +x 00:29:06.745 INFO: APP EXITING 00:29:06.745 INFO: killing all VMs 00:29:06.745 INFO: killing vhost app 00:29:06.745 INFO: EXIT DONE 00:29:09.275 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:29:09.533 Waiting for block devices as requested 00:29:09.792 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:29:09.792 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:29:09.792 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:29:10.050 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:29:10.050 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:29:10.050 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:29:10.050 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:29:10.309 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:29:10.309 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:29:10.309 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:29:10.309 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:29:10.568 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:29:10.568 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:29:10.568 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:29:10.827 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:29:10.827 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:29:10.827 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:29:14.114 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:29:14.114 Cleaning 00:29:14.114 Removing: /var/run/dpdk/spdk0/config 00:29:14.114 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:29:14.114 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:29:14.114 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:29:14.114 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3
00:29:14.114 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0
00:29:14.114 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1
00:29:14.114 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2
00:29:14.114 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3
00:29:14.114 Removing: /var/run/dpdk/spdk0/fbarray_memzone
00:29:14.114 Removing: /var/run/dpdk/spdk0/hugepage_info
00:29:14.114 Removing: /dev/shm/nvmf_trace.0
00:29:14.114 Removing: /dev/shm/spdk_tgt_trace.pid546516
00:29:14.114 Removing: /var/run/dpdk/spdk0
00:29:14.114 Removing: /var/run/dpdk/spdk_pid542930
00:29:14.114 Removing: /var/run/dpdk/spdk_pid545340
00:29:14.114 Removing: /var/run/dpdk/spdk_pid546516
00:29:14.114 Removing: /var/run/dpdk/spdk_pid547145
00:29:14.114 Removing: /var/run/dpdk/spdk_pid548079
00:29:14.114 Removing: /var/run/dpdk/spdk_pid548323
00:29:14.114 Removing: /var/run/dpdk/spdk_pid549280
00:29:14.114 Removing: /var/run/dpdk/spdk_pid549385
00:29:14.114 Removing: /var/run/dpdk/spdk_pid549632
00:29:14.114 Removing: /var/run/dpdk/spdk_pid552288
00:29:14.114 Removing: /var/run/dpdk/spdk_pid554000
00:29:14.114 Removing: /var/run/dpdk/spdk_pid554278
00:29:14.114 Removing: /var/run/dpdk/spdk_pid554566
00:29:14.114 Removing: /var/run/dpdk/spdk_pid554861
00:29:14.114 Removing: /var/run/dpdk/spdk_pid555202
00:29:14.114 Removing: /var/run/dpdk/spdk_pid555446
00:29:14.114 Removing: /var/run/dpdk/spdk_pid555654
00:29:14.114 Removing: /var/run/dpdk/spdk_pid555943
00:29:14.114 Removing: /var/run/dpdk/spdk_pid556868
00:29:14.114 Removing: /var/run/dpdk/spdk_pid559837
00:29:14.114 Removing: /var/run/dpdk/spdk_pid560083
00:29:14.114 Removing: /var/run/dpdk/spdk_pid560369
00:29:14.114 Removing: /var/run/dpdk/spdk_pid560638
00:29:14.114 Removing: /var/run/dpdk/spdk_pid560659
00:29:14.114 Removing: /var/run/dpdk/spdk_pid560940
00:29:14.114 Removing: /var/run/dpdk/spdk_pid561184
00:29:14.114 Removing: /var/run/dpdk/spdk_pid561426
00:29:14.114 Removing: /var/run/dpdk/spdk_pid561679
00:29:14.114 Removing: /var/run/dpdk/spdk_pid561921
00:29:14.114 Removing: /var/run/dpdk/spdk_pid562171
00:29:14.114 Removing: /var/run/dpdk/spdk_pid562416
00:29:14.114 Removing: /var/run/dpdk/spdk_pid562662
00:29:14.114 Removing: /var/run/dpdk/spdk_pid562914
00:29:14.114 Removing: /var/run/dpdk/spdk_pid563159
00:29:14.114 Removing: /var/run/dpdk/spdk_pid563408
00:29:14.114 Removing: /var/run/dpdk/spdk_pid563651
00:29:14.114 Removing: /var/run/dpdk/spdk_pid563897
00:29:14.114 Removing: /var/run/dpdk/spdk_pid564146
00:29:14.114 Removing: /var/run/dpdk/spdk_pid564475
00:29:14.114 Removing: /var/run/dpdk/spdk_pid564765
00:29:14.114 Removing: /var/run/dpdk/spdk_pid565022
00:29:14.114 Removing: /var/run/dpdk/spdk_pid565267
00:29:14.114 Removing: /var/run/dpdk/spdk_pid565521
00:29:14.114 Removing: /var/run/dpdk/spdk_pid565823
00:29:14.114 Removing: /var/run/dpdk/spdk_pid566349
00:29:14.114 Removing: /var/run/dpdk/spdk_pid566642
00:29:14.114 Removing: /var/run/dpdk/spdk_pid567107
00:29:14.114 Removing: /var/run/dpdk/spdk_pid567350
00:29:14.114 Removing: /var/run/dpdk/spdk_pid567603
00:29:14.114 Removing: /var/run/dpdk/spdk_pid567966
00:29:14.114 Removing: /var/run/dpdk/spdk_pid568317
00:29:14.114 Removing: /var/run/dpdk/spdk_pid568570
00:29:14.114 Removing: /var/run/dpdk/spdk_pid568825
00:29:14.372 Removing: /var/run/dpdk/spdk_pid569097
00:29:14.372 Removing: /var/run/dpdk/spdk_pid569405
00:29:14.372 Removing: /var/run/dpdk/spdk_pid569770
00:29:14.372 Removing: /var/run/dpdk/spdk_pid570228
00:29:14.372 Removing: /var/run/dpdk/spdk_pid570263
00:29:14.372 Removing: /var/run/dpdk/spdk_pid573996
00:29:14.372 Removing: /var/run/dpdk/spdk_pid576053
00:29:14.372 Removing: /var/run/dpdk/spdk_pid578016
00:29:14.372 Removing: /var/run/dpdk/spdk_pid579161
00:29:14.372 Removing: /var/run/dpdk/spdk_pid580324
00:29:14.372 Removing: /var/run/dpdk/spdk_pid580675
00:29:14.372 Removing: /var/run/dpdk/spdk_pid580812
00:29:14.372 Removing: /var/run/dpdk/spdk_pid580842
00:29:14.372 Removing: /var/run/dpdk/spdk_pid585487
00:29:14.372 Removing: /var/run/dpdk/spdk_pid586185
00:29:14.372 Removing: /var/run/dpdk/spdk_pid587330
00:29:14.372 Removing: /var/run/dpdk/spdk_pid587589
00:29:14.372 Removing: /var/run/dpdk/spdk_pid592890
00:29:14.372 Removing: /var/run/dpdk/spdk_pid594404
00:29:14.372 Removing: /var/run/dpdk/spdk_pid595279
00:29:14.372 Removing: /var/run/dpdk/spdk_pid599754
00:29:14.372 Removing: /var/run/dpdk/spdk_pid601568
00:29:14.372 Removing: /var/run/dpdk/spdk_pid602357
00:29:14.372 Removing: /var/run/dpdk/spdk_pid606557
00:29:14.372 Removing: /var/run/dpdk/spdk_pid608726
00:29:14.372 Removing: /var/run/dpdk/spdk_pid609713
00:29:14.372 Removing: /var/run/dpdk/spdk_pid618947
00:29:14.372 Removing: /var/run/dpdk/spdk_pid620903
00:29:14.372 Removing: /var/run/dpdk/spdk_pid621918
00:29:14.372 Removing: /var/run/dpdk/spdk_pid631215
00:29:14.372 Removing: /var/run/dpdk/spdk_pid633451
00:29:14.372 Removing: /var/run/dpdk/spdk_pid634458
00:29:14.372 Removing: /var/run/dpdk/spdk_pid644127
00:29:14.372 Removing: /var/run/dpdk/spdk_pid647317
00:29:14.372 Removing: /var/run/dpdk/spdk_pid648323
00:29:14.372 Removing: /var/run/dpdk/spdk_pid658651
00:29:14.372 Removing: /var/run/dpdk/spdk_pid661033
00:29:14.372 Removing: /var/run/dpdk/spdk_pid662048
00:29:14.372 Removing: /var/run/dpdk/spdk_pid673098
00:29:14.373 Removing: /var/run/dpdk/spdk_pid675482
00:29:14.373 Removing: /var/run/dpdk/spdk_pid676500
00:29:14.373 Removing: /var/run/dpdk/spdk_pid686926
00:29:14.373 Removing: /var/run/dpdk/spdk_pid690548
00:29:14.373 Removing: /var/run/dpdk/spdk_pid691747
00:29:14.373 Removing: /var/run/dpdk/spdk_pid692793
00:29:14.373 Removing: /var/run/dpdk/spdk_pid695766
00:29:14.373 Removing: /var/run/dpdk/spdk_pid700784
00:29:14.373 Removing: /var/run/dpdk/spdk_pid703638
00:29:14.373 Removing: /var/run/dpdk/spdk_pid708572
00:29:14.373 Removing: /var/run/dpdk/spdk_pid711998
00:29:14.373 Removing: /var/run/dpdk/spdk_pid717452
00:29:14.373 Removing: /var/run/dpdk/spdk_pid720360
00:29:14.373 Removing: /var/run/dpdk/spdk_pid726725
00:29:14.373 Removing: /var/run/dpdk/spdk_pid728992
00:29:14.373 Removing: /var/run/dpdk/spdk_pid735143
00:29:14.373 Removing: /var/run/dpdk/spdk_pid737802
00:29:14.373 Removing: /var/run/dpdk/spdk_pid743968
00:29:14.373 Removing: /var/run/dpdk/spdk_pid746244
00:29:14.373 Removing: /var/run/dpdk/spdk_pid750654
00:29:14.373 Removing: /var/run/dpdk/spdk_pid751009
00:29:14.373 Removing: /var/run/dpdk/spdk_pid751358
00:29:14.373 Removing: /var/run/dpdk/spdk_pid751822
00:29:14.373 Removing: /var/run/dpdk/spdk_pid752348
00:29:14.373 Removing: /var/run/dpdk/spdk_pid753101
00:29:14.373 Removing: /var/run/dpdk/spdk_pid753928
00:29:14.373 Removing: /var/run/dpdk/spdk_pid754290
00:29:14.373 Removing: /var/run/dpdk/spdk_pid756098
00:29:14.373 Removing: /var/run/dpdk/spdk_pid757937
00:29:14.373 Removing: /var/run/dpdk/spdk_pid759775
00:29:14.373 Removing: /var/run/dpdk/spdk_pid761224
00:29:14.373 Removing: /var/run/dpdk/spdk_pid763059
00:29:14.373 Removing: /var/run/dpdk/spdk_pid764897
00:29:14.373 Removing: /var/run/dpdk/spdk_pid766741
00:29:14.373 Removing: /var/run/dpdk/spdk_pid768295
00:29:14.373 Removing: /var/run/dpdk/spdk_pid769173
00:29:14.373 Removing: /var/run/dpdk/spdk_pid769855
00:29:14.373 Removing: /var/run/dpdk/spdk_pid771918
00:29:14.373 Removing: /var/run/dpdk/spdk_pid774232
00:29:14.373 Removing: /var/run/dpdk/spdk_pid776473
00:29:14.373 Removing: /var/run/dpdk/spdk_pid777803
00:29:14.373 Removing: /var/run/dpdk/spdk_pid779215
00:29:14.373 Removing: /var/run/dpdk/spdk_pid779779
00:29:14.373 Removing: /var/run/dpdk/spdk_pid779930
00:29:14.373 Removing: /var/run/dpdk/spdk_pid780002
00:29:14.373 Removing: /var/run/dpdk/spdk_pid780252
00:29:14.373 Removing: /var/run/dpdk/spdk_pid780494
00:29:14.373 Removing: /var/run/dpdk/spdk_pid781588
00:29:14.373 Removing: /var/run/dpdk/spdk_pid783531
00:29:14.373 Removing: /var/run/dpdk/spdk_pid785306
00:29:14.373 Removing: /var/run/dpdk/spdk_pid786242
00:29:14.373 Removing: /var/run/dpdk/spdk_pid787305
00:29:14.373 Removing: /var/run/dpdk/spdk_pid787554
00:29:14.373 Removing: /var/run/dpdk/spdk_pid787579
00:29:14.373 Removing: /var/run/dpdk/spdk_pid787610
00:29:14.373 Removing: /var/run/dpdk/spdk_pid788796
00:29:14.373 Removing: /var/run/dpdk/spdk_pid789487
00:29:14.373 Removing: /var/run/dpdk/spdk_pid789964
00:29:14.373 Removing: /var/run/dpdk/spdk_pid792023
00:29:14.373 Removing: /var/run/dpdk/spdk_pid794205
00:29:14.373 Removing: /var/run/dpdk/spdk_pid796525
00:29:14.373 Removing: /var/run/dpdk/spdk_pid797755
00:29:14.373 Removing: /var/run/dpdk/spdk_pid799246
00:29:14.373 Removing: /var/run/dpdk/spdk_pid799939
00:29:14.373 Removing: /var/run/dpdk/spdk_pid799966
00:29:14.373 Removing: /var/run/dpdk/spdk_pid804212
00:29:14.373 Removing: /var/run/dpdk/spdk_pid804291
00:29:14.373 Removing: /var/run/dpdk/spdk_pid804535
00:29:14.633 Removing: /var/run/dpdk/spdk_pid804579
00:29:14.633 Removing: /var/run/dpdk/spdk_pid804832
00:29:14.633 Removing: /var/run/dpdk/spdk_pid805351
00:29:14.633 Removing: /var/run/dpdk/spdk_pid806475
00:29:14.633 Removing: /var/run/dpdk/spdk_pid807669
00:29:14.633 Removing: /var/run/dpdk/spdk_pid808822
00:29:14.633 Clean
00:29:14.633 12:10:04 -- common/autotest_common.sh@1451 -- # return 0
00:29:14.633 12:10:04 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup
00:29:14.633 12:10:04 -- common/autotest_common.sh@728 -- # xtrace_disable
00:29:14.633 12:10:04 -- common/autotest_common.sh@10 -- # set +x
00:29:14.633 12:10:04 -- spdk/autotest.sh@386 -- # timing_exit autotest
00:29:14.633 12:10:04 -- common/autotest_common.sh@728 -- # xtrace_disable
00:29:14.633 12:10:04 -- common/autotest_common.sh@10 -- # set +x
00:29:14.633 12:10:04 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:29:14.633 12:10:04 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]]
00:29:14.633 12:10:04 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log
00:29:14.633 12:10:04 -- spdk/autotest.sh@391 -- # hash lcov
00:29:14.633 12:10:04 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:29:14.633 12:10:04 -- spdk/autotest.sh@393 -- # hostname
00:29:14.633 12:10:04 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-03 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info
00:29:14.891 geninfo: WARNING: invalid characters removed from testname!
00:29:32.994 12:10:20 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:32.994 12:10:23 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:34.904 12:10:24 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:36.283 12:10:26 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:38.189 12:10:28 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:40.088 12:10:29 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:29:41.461 12:10:31 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:29:41.461 12:10:31 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:29:41.461 12:10:31 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:29:41.461 12:10:31 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:29:41.461 12:10:31 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:29:41.461 12:10:31 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:41.461 12:10:31 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:41.461 12:10:31 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:41.461 12:10:31 -- paths/export.sh@5 -- $ export PATH
00:29:41.461 12:10:31 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:41.461 12:10:31 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:29:41.461 12:10:31 -- common/autobuild_common.sh@444 -- $ date +%s
00:29:41.461 12:10:31 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720779031.XXXXXX
00:29:41.461 12:10:31 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720779031.mFy5e9
00:29:41.461 12:10:31 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
00:29:41.461 12:10:31 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
00:29:41.461 12:10:31 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:29:41.461 12:10:31 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:29:41.461 12:10:31 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:29:41.461 12:10:31 -- common/autobuild_common.sh@460 -- $ get_config_params
00:29:41.461 12:10:31 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:29:41.461 12:10:31 -- common/autotest_common.sh@10 -- $ set +x
00:29:41.461 12:10:31 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:29:41.461 12:10:31 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:29:41.461 12:10:31 -- pm/common@17 -- $ local monitor
00:29:41.461 12:10:31 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:41.461 12:10:31 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:41.461 12:10:31 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:41.461 12:10:31 -- pm/common@21 -- $ date +%s
00:29:41.461 12:10:31 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:41.461 12:10:31 -- pm/common@21 -- $ date +%s
00:29:41.461 12:10:31 -- pm/common@25 -- $ sleep 1
00:29:41.461 12:10:31 -- pm/common@21 -- $ date +%s
00:29:41.461 12:10:31 -- pm/common@21 -- $ date +%s
00:29:41.461 12:10:31 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720779031
00:29:41.461 12:10:31 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720779031
00:29:41.462 12:10:31 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720779031
00:29:41.462 12:10:31 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720779031
00:29:41.720 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720779031_collect-cpu-temp.pm.log
00:29:41.720 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720779031_collect-vmstat.pm.log
00:29:41.720 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720779031_collect-cpu-load.pm.log
00:29:41.720 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720779031_collect-bmc-pm.bmc.pm.log
00:29:42.655 12:10:32 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:29:42.655 12:10:32 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j96
00:29:42.655 12:10:32 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:29:42.655 12:10:32 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:29:42.655 12:10:32 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:29:42.655 12:10:32 -- spdk/autopackage.sh@19 -- $ timing_finish
00:29:42.655 12:10:32 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:29:42.655 12:10:32 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:29:42.655 12:10:32 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:29:42.655 12:10:32 -- spdk/autopackage.sh@20 -- $ exit 0
00:29:42.655 12:10:32 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:29:42.655 12:10:32 -- pm/common@29 -- $ signal_monitor_resources TERM
00:29:42.655 12:10:32 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:29:42.655 12:10:32 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:42.655 12:10:32 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:29:42.655 12:10:32 -- pm/common@44 -- $ pid=819871
00:29:42.655 12:10:32 -- pm/common@50 -- $ kill -TERM 819871
00:29:42.655 12:10:32 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:42.655 12:10:32 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:29:42.655 12:10:32 -- pm/common@44 -- $ pid=819872
00:29:42.655 12:10:32 -- pm/common@50 -- $ kill -TERM 819872
00:29:42.655 12:10:32 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:42.655 12:10:32 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:29:42.655 12:10:32 -- pm/common@44 -- $ pid=819875
00:29:42.655 12:10:32 -- pm/common@50 -- $ kill -TERM 819875
00:29:42.655 12:10:32 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:29:42.655 12:10:32 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:29:42.655 12:10:32 -- pm/common@44 -- $ pid=819899
00:29:42.655 12:10:32 -- pm/common@50 -- $ sudo -E kill -TERM 819899
00:29:42.655 + [[ -n 425363 ]]
00:29:42.655 + sudo kill 425363
00:29:42.664 [Pipeline] }
00:29:42.684 [Pipeline] // stage
00:29:42.689 [Pipeline] }
00:29:42.707 [Pipeline] // timeout
00:29:42.712 [Pipeline] }
00:29:42.728 [Pipeline] // catchError
00:29:42.733 [Pipeline] }
00:29:42.749 [Pipeline] // wrap
00:29:42.755 [Pipeline] }
00:29:42.768 [Pipeline] // catchError
00:29:42.777 [Pipeline] stage
00:29:42.778 [Pipeline] { (Epilogue)
00:29:42.792 [Pipeline] catchError
00:29:42.793 [Pipeline] {
00:29:42.807 [Pipeline] echo
00:29:42.808 Cleanup processes
00:29:42.815 [Pipeline] sh
00:29:43.102 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:29:43.102 820012 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:29:43.102 820267 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:29:43.113 [Pipeline] sh
00:29:43.393 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:29:43.393 ++ grep -v 'sudo pgrep'
00:29:43.393 ++ awk '{print $1}'
00:29:43.393 + sudo kill -9 820012
00:29:43.404 [Pipeline] sh
00:29:43.685 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:29:51.811 [Pipeline] sh
00:29:52.093 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:29:52.093 Artifacts sizes are good
00:29:52.105 [Pipeline] archiveArtifacts
00:29:52.112 Archiving artifacts
00:29:52.257 [Pipeline] sh
00:29:52.562 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:29:52.576 [Pipeline] cleanWs
00:29:52.585 [WS-CLEANUP] Deleting project workspace...
00:29:52.585 [WS-CLEANUP] Deferred wipeout is used...
00:29:52.592 [WS-CLEANUP] done
00:29:52.593 [Pipeline] }
00:29:52.610 [Pipeline] // catchError
00:29:52.619 [Pipeline] sh
00:29:52.898 + logger -p user.info -t JENKINS-CI
00:29:52.934 [Pipeline] }
00:29:52.955 [Pipeline] // stage
00:29:52.959 [Pipeline] }
00:29:52.968 [Pipeline] // node
00:29:52.971 [Pipeline] End of Pipeline
00:29:53.065 Finished: SUCCESS